Jan 28 01:19:15.777850 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 27 22:22:24 -00 2026 Jan 28 01:19:15.777875 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3 Jan 28 01:19:15.777886 kernel: BIOS-provided physical RAM map: Jan 28 01:19:15.777893 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 28 01:19:15.777899 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved Jan 28 01:19:15.777906 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000044fdfff] usable Jan 28 01:19:15.777914 kernel: BIOS-e820: [mem 0x00000000044fe000-0x00000000048fdfff] reserved Jan 28 01:19:15.777920 kernel: BIOS-e820: [mem 0x00000000048fe000-0x000000003ff1efff] usable Jan 28 01:19:15.777927 kernel: BIOS-e820: [mem 0x000000003ff1f000-0x000000003ffc8fff] reserved Jan 28 01:19:15.777935 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data Jan 28 01:19:15.777942 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS Jan 28 01:19:15.777976 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable Jan 28 01:19:15.777983 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable Jan 28 01:19:15.777990 kernel: printk: legacy bootconsole [earlyser0] enabled Jan 28 01:19:15.777998 kernel: NX (Execute Disable) protection: active Jan 28 01:19:15.778007 kernel: APIC: Static calls initialized Jan 28 01:19:15.778014 kernel: efi: EFI v2.7 by Microsoft Jan 28 01:19:15.778021 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff88000 SMBIOS 3.0=0x3ff86000 MEMATTR=0x3e99f698 RNG=0x3ffd2018 Jan 28 01:19:15.778028 kernel: random: crng init done Jan 28 01:19:15.778035 kernel: secureboot: Secure boot disabled Jan 28 01:19:15.778042 kernel: SMBIOS 3.1.0 present. 
Jan 28 01:19:15.778050 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 07/25/2025 Jan 28 01:19:15.778057 kernel: DMI: Memory slots populated: 2/2 Jan 28 01:19:15.778064 kernel: Hypervisor detected: Microsoft Hyper-V Jan 28 01:19:15.778071 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2 Jan 28 01:19:15.778080 kernel: Hyper-V: Nested features: 0x3e0101 Jan 28 01:19:15.778087 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 Jan 28 01:19:15.778094 kernel: Hyper-V: Using hypercall for remote TLB flush Jan 28 01:19:15.778101 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns Jan 28 01:19:15.778108 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns Jan 28 01:19:15.778115 kernel: tsc: Detected 2300.000 MHz processor Jan 28 01:19:15.778122 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 28 01:19:15.778131 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 28 01:19:15.778138 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000 Jan 28 01:19:15.778147 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 28 01:19:15.778155 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 28 01:19:15.778162 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved Jan 28 01:19:15.778170 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000 Jan 28 01:19:15.778177 kernel: Using GB pages for direct mapping Jan 28 01:19:15.778185 kernel: ACPI: Early table checksum verification disabled Jan 28 01:19:15.778197 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) Jan 28 01:19:15.778205 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778213 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778221 kernel: ACPI: DSDT 0x000000003FFD6000 01E22B (v02 MSFTVM DSDT01 00000001 INTL 20230628) Jan 28 01:19:15.778228 kernel: ACPI: FACS 0x000000003FFFE000 000040 Jan 28 01:19:15.778237 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778246 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778254 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778262 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v05 HVLITE HVLITETB 00000000 MSHV 00000000) Jan 28 01:19:15.778270 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000) Jan 28 01:19:15.778278 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jan 28 01:19:15.778286 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] Jan 28 01:19:15.778296 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff422a] Jan 28 01:19:15.778303 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] Jan 28 01:19:15.778312 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] Jan 28 01:19:15.778320 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] Jan 28 01:19:15.778328 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] Jan 28 01:19:15.778336 kernel: ACPI: Reserving APIC table memory at [mem 
0x3ffd5000-0x3ffd5057] Jan 28 01:19:15.778344 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f] Jan 28 01:19:15.778354 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] Jan 28 01:19:15.778362 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] Jan 28 01:19:15.778370 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] Jan 28 01:19:15.778378 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff] Jan 28 01:19:15.778385 kernel: NODE_DATA(0) allocated [mem 0x2bfff6dc0-0x2bfffdfff] Jan 28 01:19:15.778393 kernel: Zone ranges: Jan 28 01:19:15.778401 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 28 01:19:15.778410 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 28 01:19:15.778418 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] Jan 28 01:19:15.778426 kernel: Device empty Jan 28 01:19:15.778433 kernel: Movable zone start for each node Jan 28 01:19:15.778441 kernel: Early memory node ranges Jan 28 01:19:15.778449 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 28 01:19:15.778457 kernel: node 0: [mem 0x0000000000100000-0x00000000044fdfff] Jan 28 01:19:15.778466 kernel: node 0: [mem 0x00000000048fe000-0x000000003ff1efff] Jan 28 01:19:15.778473 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] Jan 28 01:19:15.778481 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] Jan 28 01:19:15.778489 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] Jan 28 01:19:15.778497 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 28 01:19:15.778504 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 28 01:19:15.778512 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Jan 28 01:19:15.778521 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges Jan 28 01:19:15.778529 kernel: ACPI: PM-Timer IO Port: 0x408 Jan 28 01:19:15.778537 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1]) Jan 28 01:19:15.778545 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 28 01:19:15.778552 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 28 01:19:15.778560 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 28 01:19:15.778568 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200 Jan 28 01:19:15.778575 kernel: TSC deadline timer available Jan 28 01:19:15.778585 kernel: CPU topo: Max. logical packages: 1 Jan 28 01:19:15.778592 kernel: CPU topo: Max. logical dies: 1 Jan 28 01:19:15.778600 kernel: CPU topo: Max. dies per package: 1 Jan 28 01:19:15.778607 kernel: CPU topo: Max. threads per core: 2 Jan 28 01:19:15.778615 kernel: CPU topo: Num. cores per package: 1 Jan 28 01:19:15.778622 kernel: CPU topo: Num. 
threads per package: 2 Jan 28 01:19:15.778630 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 28 01:19:15.778640 kernel: [mem 0x40000000-0xffffffff] available for PCI devices Jan 28 01:19:15.778648 kernel: Booting paravirtualized kernel on Hyper-V Jan 28 01:19:15.778656 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 28 01:19:15.778664 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 28 01:19:15.778672 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 28 01:19:15.778680 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 28 01:19:15.778687 kernel: pcpu-alloc: [0] 0 1 Jan 28 01:19:15.778697 kernel: Hyper-V: PV spinlocks enabled Jan 28 01:19:15.778704 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 28 01:19:15.778713 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3 Jan 28 01:19:15.778722 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Jan 28 01:19:15.778730 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 28 01:19:15.778737 kernel: Fallback order for Node 0: 0 Jan 28 01:19:15.778747 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2095807 Jan 28 01:19:15.778754 kernel: Policy zone: Normal Jan 28 01:19:15.778763 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 28 01:19:15.778770 kernel: software IO TLB: area num 2. Jan 28 01:19:15.778778 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 28 01:19:15.778786 kernel: ftrace: allocating 40128 entries in 157 pages Jan 28 01:19:15.778793 kernel: ftrace: allocated 157 pages with 5 groups Jan 28 01:19:15.778801 kernel: Dynamic Preempt: voluntary Jan 28 01:19:15.778810 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 28 01:19:15.778821 kernel: rcu: RCU event tracing is enabled. Jan 28 01:19:15.778835 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 28 01:19:15.778845 kernel: Trampoline variant of Tasks RCU enabled. Jan 28 01:19:15.778854 kernel: Rude variant of Tasks RCU enabled. Jan 28 01:19:15.778862 kernel: Tracing variant of Tasks RCU enabled. Jan 28 01:19:15.778870 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 28 01:19:15.778879 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 28 01:19:15.778887 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 28 01:19:15.778897 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 28 01:19:15.778906 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 28 01:19:15.778914 kernel: Using NULL legacy PIC Jan 28 01:19:15.778923 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 Jan 28 01:19:15.778932 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Jan 28 01:19:15.778940 kernel: Console: colour dummy device 80x25 Jan 28 01:19:15.778958 kernel: printk: legacy console [tty1] enabled Jan 28 01:19:15.778966 kernel: printk: legacy console [ttyS0] enabled Jan 28 01:19:15.778975 kernel: printk: legacy bootconsole [earlyser0] disabled Jan 28 01:19:15.778983 kernel: ACPI: Core revision 20240827 Jan 28 01:19:15.778991 kernel: Failed to register legacy timer interrupt Jan 28 01:19:15.779001 kernel: APIC: Switch to symmetric I/O mode setup Jan 28 01:19:15.779010 kernel: x2apic enabled Jan 28 01:19:15.779018 kernel: APIC: Switched APIC routing to: physical x2apic Jan 28 01:19:15.779026 kernel: Hyper-V: Host Build 10.0.26100.1448-1-0 Jan 28 01:19:15.779035 kernel: Hyper-V: enabling crash_kexec_post_notifiers Jan 28 01:19:15.779043 kernel: Hyper-V: Disabling IBT because of Hyper-V bug Jan 28 01:19:15.779052 kernel: Hyper-V: Using IPI hypercalls Jan 28 01:19:15.779062 kernel: APIC: send_IPI() replaced with hv_send_ipi() Jan 28 01:19:15.779070 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() Jan 28 01:19:15.779079 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() Jan 28 01:19:15.779088 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() Jan 28 01:19:15.779096 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() Jan 28 01:19:15.779104 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() Jan 28 01:19:15.779113 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212735223b2, max_idle_ns: 440795277976 ns Jan 28 01:19:15.779123 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4600.00 BogoMIPS (lpj=2300000) Jan 28 01:19:15.779131 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 28 01:19:15.779140 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 28 01:19:15.779148 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 28 01:19:15.779156 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 28 01:19:15.779164 kernel: Spectre V2 : Mitigation: Retpolines Jan 28 01:19:15.779172 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Jan 28 01:19:15.779180 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! 
Jan 28 01:19:15.779190 kernel: RETBleed: Vulnerable Jan 28 01:19:15.779198 kernel: Speculative Store Bypass: Vulnerable Jan 28 01:19:15.779206 kernel: active return thunk: its_return_thunk Jan 28 01:19:15.779214 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 28 01:19:15.779221 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 28 01:19:15.779229 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 28 01:19:15.779237 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 28 01:19:15.779245 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 28 01:19:15.779253 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 28 01:19:15.779261 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 28 01:19:15.779271 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers' Jan 28 01:19:15.779279 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config' Jan 28 01:19:15.779287 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data' Jan 28 01:19:15.779295 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 28 01:19:15.779303 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 28 01:19:15.779311 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 28 01:19:15.779318 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 28 01:19:15.779326 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16 Jan 28 01:19:15.779334 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64 Jan 28 01:19:15.779342 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192 Jan 28 01:19:15.779350 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format. Jan 28 01:19:15.779359 kernel: Freeing SMP alternatives memory: 32K Jan 28 01:19:15.779367 kernel: pid_max: default: 32768 minimum: 301 Jan 28 01:19:15.779375 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 28 01:19:15.779383 kernel: landlock: Up and running. Jan 28 01:19:15.779391 kernel: SELinux: Initializing. Jan 28 01:19:15.779399 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 28 01:19:15.779407 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 28 01:19:15.779415 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2) Jan 28 01:19:15.779423 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only. Jan 28 01:19:15.779432 kernel: signal: max sigframe size: 11952 Jan 28 01:19:15.779441 kernel: rcu: Hierarchical SRCU implementation. Jan 28 01:19:15.779450 kernel: rcu: Max phase no-delay instances is 400. Jan 28 01:19:15.779458 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 28 01:19:15.779467 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 28 01:19:15.779475 kernel: smp: Bringing up secondary CPUs ... Jan 28 01:19:15.779484 kernel: smpboot: x86: Booting SMP configuration: Jan 28 01:19:15.779492 kernel: .... 
node #0, CPUs: #1 Jan 28 01:19:15.779500 kernel: smp: Brought up 1 node, 2 CPUs Jan 28 01:19:15.779510 kernel: smpboot: Total of 2 processors activated (9200.00 BogoMIPS) Jan 28 01:19:15.779519 kernel: Memory: 8093656K/8383228K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 283612K reserved, 0K cma-reserved) Jan 28 01:19:15.779527 kernel: devtmpfs: initialized Jan 28 01:19:15.779535 kernel: x86/mm: Memory block size: 128MB Jan 28 01:19:15.779543 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) Jan 28 01:19:15.779551 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 28 01:19:15.779558 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 28 01:19:15.779565 kernel: pinctrl core: initialized pinctrl subsystem Jan 28 01:19:15.779576 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 28 01:19:15.779589 kernel: audit: initializing netlink subsys (disabled) Jan 28 01:19:15.779604 kernel: audit: type=2000 audit(1769563150.107:1): state=initialized audit_enabled=0 res=1 Jan 28 01:19:15.779618 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 28 01:19:15.779633 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 28 01:19:15.779648 kernel: cpuidle: using governor menu Jan 28 01:19:15.779666 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 28 01:19:15.779675 kernel: dca service started, version 1.12.1 Jan 28 01:19:15.779683 kernel: e820: reserve RAM buffer [mem 0x044fe000-0x07ffffff] Jan 28 01:19:15.779691 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff] Jan 28 01:19:15.779700 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jan 28 01:19:15.779708 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 28 01:19:15.779716 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 28 01:19:15.779726 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 28 01:19:15.779734 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 28 01:19:15.779742 kernel: ACPI: Added _OSI(Module Device) Jan 28 01:19:15.779750 kernel: ACPI: Added _OSI(Processor Device) Jan 28 01:19:15.779758 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 28 01:19:15.779766 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 28 01:19:15.779774 kernel: ACPI: Interpreter enabled Jan 28 01:19:15.779783 kernel: ACPI: PM: (supports S0 S5) Jan 28 01:19:15.779791 kernel: ACPI: Using IOAPIC for interrupt routing Jan 28 01:19:15.779799 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 28 01:19:15.779807 kernel: PCI: Ignoring E820 reservations for host bridge windows Jan 28 01:19:15.779815 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F Jan 28 01:19:15.779823 kernel: iommu: Default domain type: Translated Jan 28 01:19:15.779831 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 28 01:19:15.779841 kernel: efivars: Registered efivars operations Jan 28 01:19:15.779849 kernel: PCI: Using ACPI for IRQ routing Jan 28 01:19:15.779857 kernel: PCI: System does not support PCI Jan 28 01:19:15.779865 kernel: vgaarb: loaded Jan 28 01:19:15.779873 kernel: clocksource: Switched to clocksource tsc-early Jan 28 01:19:15.779882 kernel: VFS: Disk quotas dquot_6.6.0 Jan 28 01:19:15.779889 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 28 01:19:15.779900 kernel: pnp: PnP ACPI init Jan 28 01:19:15.779908 kernel: pnp: PnP ACPI: found 3 devices Jan 28 01:19:15.779917 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 28 01:19:15.779925 kernel: NET: Registered PF_INET protocol family Jan 28 01:19:15.779933 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 28 01:19:15.779941 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) Jan 28 01:19:15.779968 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 28 01:19:15.779978 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 28 01:19:15.779987 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 28 01:19:15.779995 kernel: TCP: Hash tables configured (established 65536 bind 65536) Jan 28 01:19:15.780004 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) Jan 28 01:19:15.780013 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) Jan 28 01:19:15.780022 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 28 01:19:15.780030 kernel: NET: Registered PF_XDP protocol family Jan 28 01:19:15.780040 kernel: PCI: CLS 0 bytes, default 64 Jan 28 01:19:15.780048 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 28 01:19:15.780057 kernel: software IO TLB: mapped [mem 0x000000003a99f000-0x000000003e99f000] (64MB) Jan 28 01:19:15.780065 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer Jan 28 01:19:15.780073 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules Jan 28 01:19:15.780081 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212735223b2, 
max_idle_ns: 440795277976 ns Jan 28 01:19:15.780089 kernel: clocksource: Switched to clocksource tsc Jan 28 01:19:15.780099 kernel: Initialise system trusted keyrings Jan 28 01:19:15.780107 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 Jan 28 01:19:15.780115 kernel: Key type asymmetric registered Jan 28 01:19:15.780123 kernel: Asymmetric key parser 'x509' registered Jan 28 01:19:15.780131 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 28 01:19:15.780140 kernel: io scheduler mq-deadline registered Jan 28 01:19:15.780149 kernel: io scheduler kyber registered Jan 28 01:19:15.780160 kernel: io scheduler bfq registered Jan 28 01:19:15.780168 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 28 01:19:15.780176 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 28 01:19:15.780184 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 28 01:19:15.780192 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Jan 28 01:19:15.780200 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A Jan 28 01:19:15.780209 kernel: i8042: PNP: No PS/2 controller found. Jan 28 01:19:15.780365 kernel: rtc_cmos 00:02: registered as rtc0 Jan 28 01:19:15.780458 kernel: rtc_cmos 00:02: setting system clock to 2026-01-28T01:19:12 UTC (1769563152) Jan 28 01:19:15.780545 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram Jan 28 01:19:15.780554 kernel: intel_pstate: Intel P-state driver initializing Jan 28 01:19:15.780563 kernel: efifb: probing for efifb Jan 28 01:19:15.780571 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Jan 28 01:19:15.780581 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Jan 28 01:19:15.780588 kernel: efifb: scrolling: redraw Jan 28 01:19:15.780597 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 28 01:19:15.780605 kernel: Console: switching to colour frame buffer device 128x48 Jan 28 01:19:15.780613 kernel: fb0: EFI VGA frame buffer device Jan 28 01:19:15.780621 kernel: pstore: Using crash dump compression: deflate Jan 28 01:19:15.780629 kernel: pstore: Registered efi_pstore as persistent store backend Jan 28 01:19:15.780639 kernel: NET: Registered PF_INET6 protocol family Jan 28 01:19:15.780647 kernel: Segment Routing with IPv6 Jan 28 01:19:15.780655 kernel: In-situ OAM (IOAM) with IPv6 Jan 28 01:19:15.780663 kernel: NET: Registered PF_PACKET protocol family Jan 28 01:19:15.780671 kernel: Key type dns_resolver registered Jan 28 01:19:15.780679 kernel: IPI shorthand broadcast: enabled Jan 28 01:19:15.780687 kernel: sched_clock: Marking stable (1807004141, 78991026)->(2165672833, -279677666) Jan 28 01:19:15.780695 kernel: registered taskstats version 1 Jan 28 01:19:15.780704 kernel: Loading compiled-in X.509 certificates Jan 28 01:19:15.780713 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 0eb3c2aae9988d4ab7f0e142c4f5c61453c9ddb3' Jan 28 01:19:15.780721 kernel: Demotion targets for Node 0: null Jan 28 01:19:15.780729 kernel: Key type .fscrypt registered Jan 28 01:19:15.780736 kernel: Key type fscrypt-provisioning registered Jan 28 01:19:15.780744 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 28 01:19:15.780752 kernel: ima: Allocated hash algorithm: sha1 Jan 28 01:19:15.780762 kernel: ima: No architecture policies found Jan 28 01:19:15.780770 kernel: clk: Disabling unused clocks Jan 28 01:19:15.780778 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 28 01:19:15.780786 kernel: Write protecting the kernel read-only data: 47104k Jan 28 01:19:15.780794 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 28 01:19:15.780802 kernel: Run /init as init process Jan 28 01:19:15.780810 kernel: with arguments: Jan 28 01:19:15.780820 kernel: /init Jan 28 01:19:15.780828 kernel: with environment: Jan 28 01:19:15.780835 kernel: HOME=/ Jan 28 01:19:15.780843 kernel: TERM=linux Jan 28 01:19:15.780851 kernel: hv_vmbus: Vmbus version:5.3 Jan 28 01:19:15.780859 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 28 01:19:15.780867 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 28 01:19:15.780875 kernel: PTP clock support registered Jan 28 01:19:15.780885 kernel: hv_utils: Registering HyperV Utility Driver Jan 28 01:19:15.780893 kernel: hv_vmbus: registering driver hv_utils Jan 28 01:19:15.780901 kernel: hv_utils: Shutdown IC version 3.2 Jan 28 01:19:15.780909 kernel: hv_utils: Heartbeat IC version 3.0 Jan 28 01:19:15.780917 kernel: hv_utils: TimeSync IC version 4.0 Jan 28 01:19:15.780925 kernel: SCSI subsystem initialized Jan 28 01:19:15.780933 kernel: hv_vmbus: registering driver hv_pci Jan 28 01:19:15.781072 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 Jan 28 01:19:15.781172 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 Jan 28 01:19:15.781285 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] Jan 28 01:19:15.781382 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] Jan 28 01:19:15.781504 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint Jan 28 01:19:15.781613 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] Jan 28 01:19:15.781712 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 Jan 28 01:19:15.781818 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned Jan 28 01:19:15.781827 kernel: hv_vmbus: registering driver hv_storvsc Jan 28 01:19:15.781937 kernel: scsi host0: storvsc_host_t Jan 28 01:19:15.782066 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Jan 28 01:19:15.782077 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 28 01:19:15.782085 kernel: hv_vmbus: registering driver hid_hyperv Jan 28 01:19:15.782093 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Jan 28 01:19:15.782195 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Jan 28 01:19:15.782206 kernel: hv_vmbus: registering driver hyperv_keyboard Jan 28 01:19:15.782217 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 Jan 28 01:19:15.782323 kernel: nvme nvme0: pci function c05b:00:00.0 Jan 28 01:19:15.782436 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002) Jan 28 01:19:15.782514 kernel: nvme nvme0: 2/0/0 default/read/poll queues Jan 28 01:19:15.782525 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Jan 28 01:19:15.782631 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Jan 28 01:19:15.782641 kernel: cdrom: 
Uniform CD-ROM driver Revision: 3.20 Jan 28 01:19:15.782743 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Jan 28 01:19:15.782788 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 28 01:19:15.782797 kernel: device-mapper: uevent: version 1.0.3 Jan 28 01:19:15.782805 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 28 01:19:15.782813 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 28 01:19:15.782834 kernel: raid6: avx512x4 gen() 46201 MB/s Jan 28 01:19:15.782844 kernel: raid6: avx512x2 gen() 45114 MB/s Jan 28 01:19:15.782852 kernel: raid6: avx512x1 gen() 29838 MB/s Jan 28 01:19:15.782861 kernel: raid6: avx2x4 gen() 41844 MB/s Jan 28 01:19:15.782869 kernel: raid6: avx2x2 gen() 43166 MB/s Jan 28 01:19:15.782877 kernel: raid6: avx2x1 gen() 29704 MB/s Jan 28 01:19:15.782886 kernel: raid6: using algorithm avx512x4 gen() 46201 MB/s Jan 28 01:19:15.782895 kernel: raid6: .... xor() 7817 MB/s, rmw enabled Jan 28 01:19:15.782904 kernel: raid6: using avx512x2 recovery algorithm Jan 28 01:19:15.782912 kernel: xor: automatically using best checksumming function avx Jan 28 01:19:15.782921 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 28 01:19:15.782929 kernel: BTRFS: device fsid 0f5fa021-4357-40bb-b32a-e1579c5824ad devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (969) Jan 28 01:19:15.782938 kernel: BTRFS info (device dm-0): first mount of filesystem 0f5fa021-4357-40bb-b32a-e1579c5824ad Jan 28 01:19:15.782946 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:19:15.782970 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 28 01:19:15.782979 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 28 01:19:15.782988 kernel: BTRFS info (device dm-0): enabling free space tree Jan 28 01:19:15.782996 kernel: loop: module loaded Jan 28 01:19:15.783004 kernel: loop0: detected capacity change from 0 to 100552 Jan 28 01:19:15.783013 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 28 01:19:15.783022 systemd[1]: Successfully made /usr/ read-only. Jan 28 01:19:15.783035 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 28 01:19:15.783045 systemd[1]: Detected virtualization microsoft. Jan 28 01:19:15.783053 systemd[1]: Detected architecture x86-64. Jan 28 01:19:15.783062 systemd[1]: Running in initrd. Jan 28 01:19:15.783070 systemd[1]: No hostname configured, using default hostname. Jan 28 01:19:15.783079 systemd[1]: Hostname set to . Jan 28 01:19:15.783090 systemd[1]: Initializing machine ID from random generator. Jan 28 01:19:15.783098 systemd[1]: Queued start job for default target initrd.target. Jan 28 01:19:15.783107 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 01:19:15.783117 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 01:19:15.783125 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 28 01:19:15.783135 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 28 01:19:15.783146 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 28 01:19:15.783155 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 28 01:19:15.783164 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 28 01:19:15.783173 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 01:19:15.783183 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 28 01:19:15.783193 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 28 01:19:15.783202 systemd[1]: Reached target paths.target - Path Units. Jan 28 01:19:15.783212 systemd[1]: Reached target slices.target - Slice Units. Jan 28 01:19:15.783220 systemd[1]: Reached target swap.target - Swaps. Jan 28 01:19:15.783229 systemd[1]: Reached target timers.target - Timer Units. Jan 28 01:19:15.783241 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 01:19:15.783250 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 01:19:15.783259 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 28 01:19:15.783267 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 28 01:19:15.783276 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 28 01:19:15.783285 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 28 01:19:15.783294 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 28 01:19:15.783304 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 01:19:15.783313 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 01:19:15.783322 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 28 01:19:15.783331 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 28 01:19:15.783340 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 28 01:19:15.783349 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 28 01:19:15.783358 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 28 01:19:15.783368 systemd[1]: Starting systemd-fsck-usr.service... Jan 28 01:19:15.783377 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 28 01:19:15.783386 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 28 01:19:15.783395 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:15.783406 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 28 01:19:15.783414 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 01:19:15.783423 systemd[1]: Finished systemd-fsck-usr.service. Jan 28 01:19:15.783432 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 28 01:19:15.783455 systemd-journald[1106]: Collecting audit messages is enabled. 
Jan 28 01:19:15.783479 systemd-journald[1106]: Journal started Jan 28 01:19:15.783499 systemd-journald[1106]: Runtime Journal (/run/log/journal/8eaddb2441374b839cd43c9a63f95cc3) is 8M, max 158.5M, 150.5M free. Jan 28 01:19:15.786786 systemd[1]: Started systemd-journald.service - Journal Service. Jan 28 01:19:15.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.792710 kernel: audit: type=1130 audit(1769563155.785:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.790817 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 28 01:19:15.801971 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 28 01:19:15.822359 systemd-modules-load[1109]: Inserted module 'br_netfilter' Jan 28 01:19:15.829220 kernel: Bridge firewalling registered Jan 28 01:19:15.829244 kernel: audit: type=1130 audit(1769563155.822:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.822000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.823004 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 28 01:19:15.827605 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 28 01:19:15.863900 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 28 01:19:15.865380 systemd-tmpfiles[1119]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 28 01:19:15.871000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.876725 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:15.885025 kernel: audit: type=1130 audit(1769563155.871:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.885046 kernel: audit: type=1130 audit(1769563155.876:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.876000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.882192 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 28 01:19:15.887000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:15.890980 kernel: audit: type=1130 audit(1769563155.887:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.891029 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 01:19:15.892000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.896971 kernel: audit: type=1130 audit(1769563155.892:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.898076 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 28 01:19:15.901000 audit: BPF prog-id=6 op=LOAD Jan 28 01:19:15.903969 kernel: audit: type=1334 audit(1769563155.901:8): prog-id=6 op=LOAD Jan 28 01:19:15.904074 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 28 01:19:15.909090 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 28 01:19:15.924346 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 28 01:19:15.932084 kernel: audit: type=1130 audit(1769563155.925:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.925000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.934058 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 28 01:19:15.938468 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 01:19:15.949009 kernel: audit: type=1130 audit(1769563155.939:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:15.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.007905 dracut-cmdline[1144]: dracut-109 Jan 28 01:19:16.007941 systemd-resolved[1130]: Positive Trust Anchors: Jan 28 01:19:16.008288 systemd-resolved[1130]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 28 01:19:16.008296 systemd-resolved[1130]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 28 01:19:16.008329 systemd-resolved[1130]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 28 01:19:16.036000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.033052 systemd-resolved[1130]: Defaulting to hostname 'linux'. Jan 28 01:19:16.054050 dracut-cmdline[1144]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=71544b7bf64a92b2aba342c16b083723a12bedf106d3ddb24ccb63046196f1b3 Jan 28 01:19:16.033751 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 28 01:19:16.037106 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 28 01:19:16.155970 kernel: Loading iSCSI transport class v2.0-870. Jan 28 01:19:16.211966 kernel: iscsi: registered transport (tcp) Jan 28 01:19:16.259984 kernel: iscsi: registered transport (qla4xxx) Jan 28 01:19:16.260028 kernel: QLogic iSCSI HBA Driver Jan 28 01:19:16.304346 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 28 01:19:16.318931 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 01:19:16.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.324155 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 28 01:19:16.355025 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 28 01:19:16.358875 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 28 01:19:16.358920 kernel: audit: type=1130 audit(1769563156.356:13): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.356000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.360274 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 28 01:19:16.367067 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 28 01:19:16.387899 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Jan 28 01:19:16.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.396054 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 01:19:16.402025 kernel: audit: type=1130 audit(1769563156.392:14): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.402053 kernel: audit: type=1334 audit(1769563156.393:15): prog-id=7 op=LOAD Jan 28 01:19:16.402064 kernel: audit: type=1334 audit(1769563156.393:16): prog-id=8 op=LOAD Jan 28 01:19:16.393000 audit: BPF prog-id=7 op=LOAD Jan 28 01:19:16.393000 audit: BPF prog-id=8 op=LOAD Jan 28 01:19:16.427419 systemd-udevd[1381]: Using default interface naming scheme 'v257'. Jan 28 01:19:16.439169 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:19:16.447543 kernel: audit: type=1130 audit(1769563156.441:17): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.446867 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 28 01:19:16.456759 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 01:19:16.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.464214 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 28 01:19:16.469042 kernel: audit: type=1130 audit(1769563156.460:18): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.469066 kernel: audit: type=1334 audit(1769563156.462:19): prog-id=9 op=LOAD Jan 28 01:19:16.462000 audit: BPF prog-id=9 op=LOAD Jan 28 01:19:16.473703 dracut-pre-trigger[1466]: rd.md=0: removing MD RAID activation Jan 28 01:19:16.495710 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 01:19:16.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.502088 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 28 01:19:16.505114 kernel: audit: type=1130 audit(1769563156.497:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.519245 systemd-networkd[1476]: lo: Link UP Jan 28 01:19:16.527030 kernel: audit: type=1130 audit(1769563156.520:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:16.520000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.519254 systemd-networkd[1476]: lo: Gained carrier Jan 28 01:19:16.519764 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 01:19:16.521087 systemd[1]: Reached target network.target - Network. Jan 28 01:19:16.546493 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 01:19:16.555060 kernel: audit: type=1130 audit(1769563156.547:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.547000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.553165 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 28 01:19:16.632006 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#99 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Jan 28 01:19:16.633966 kernel: hv_vmbus: registering driver hv_netvsc Jan 28 01:19:16.704723 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:19:16.706633 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:16.711202 kernel: cryptd: max_cpu_qlen set to 1000 Jan 28 01:19:16.713834 systemd-networkd[1476]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:16.714000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.713845 systemd-networkd[1476]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 28 01:19:16.718120 systemd-networkd[1476]: eth0: Link UP Jan 28 01:19:16.718168 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:16.718268 systemd-networkd[1476]: eth0: Gained carrier Jan 28 01:19:16.718280 systemd-networkd[1476]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:16.722714 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:16.731662 systemd-networkd[1476]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16 Jan 28 01:19:16.751974 kernel: AES CTR mode by8 optimization enabled Jan 28 01:19:16.756059 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:16.758000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:16.844014 kernel: nvme nvme0: using unchecked data buffer Jan 28 01:19:16.933311 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. Jan 28 01:19:16.936816 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 28 01:19:17.028660 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM. Jan 28 01:19:17.039074 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. Jan 28 01:19:17.052351 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. Jan 28 01:19:17.171797 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 28 01:19:17.173000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:17.174759 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 01:19:17.177592 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 01:19:17.180217 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 28 01:19:17.186376 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 28 01:19:17.209078 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 28 01:19:17.210000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:17.903220 systemd-networkd[1476]: eth0: Gained IPv6LL Jan 28 01:19:18.126754 kernel: hv_netvsc f8615163-0000-1000-2000-6045bddd17f1 eth0: VF slot 1 added Jan 28 01:19:18.202982 disk-uuid[1666]: Warning: The kernel is still using the old partition table. Jan 28 01:19:18.202982 disk-uuid[1666]: The new table will be used at the next reboot or after you Jan 28 01:19:18.202982 disk-uuid[1666]: run partprobe(8) or kpartx(8) Jan 28 01:19:18.202982 disk-uuid[1666]: The operation has completed successfully. Jan 28 01:19:18.214705 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 28 01:19:18.214808 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 28 01:19:18.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:18.218000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:18.219848 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 28 01:19:18.261028 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1711) Jan 28 01:19:18.261155 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:19:18.262473 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:19:18.281142 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Jan 28 01:19:18.281177 kernel: BTRFS info (device nvme0n1p6): turning on async discard Jan 28 01:19:18.282092 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Jan 28 01:19:18.288225 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:19:18.288500 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jan 28 01:19:18.291000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:18.292816 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 28 01:19:19.129965 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 Jan 28 01:19:19.133459 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 Jan 28 01:19:19.133632 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] Jan 28 01:19:19.135287 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] Jan 28 01:19:19.140124 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint Jan 28 01:19:19.143202 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] Jan 28 01:19:19.147960 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] Jan 28 01:19:19.149971 kernel: pci 7870:00:00.0: enabling Extended Tags Jan 28 01:19:19.172847 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 Jan 28 01:19:19.173065 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned Jan 28 01:19:19.179983 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned Jan 28 01:19:19.196752 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) Jan 28 01:19:19.206978 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1 Jan 28 01:19:19.207211 kernel: hv_netvsc f8615163-0000-1000-2000-6045bddd17f1 eth0: VF registering: eth1 Jan 28 01:19:19.209702 kernel: mana 7870:00:00.0 eth1: joined to eth0 Jan 28 01:19:19.213736 systemd-networkd[1476]: eth1: Interface name change detected, renamed to enP30832s1. Jan 28 01:19:19.217618 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1 Jan 28 01:19:19.284183 ignition[1730]: Ignition 2.24.0 Jan 28 01:19:19.285166 ignition[1730]: Stage: fetch-offline Jan 28 01:19:19.285424 ignition[1730]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:19.285435 ignition[1730]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:19.288502 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 01:19:19.285540 ignition[1730]: parsed url from cmdline: "" Jan 28 01:19:19.285543 ignition[1730]: no config URL provided Jan 28 01:19:19.291000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.293496 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 28 01:19:19.285547 ignition[1730]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 01:19:19.285557 ignition[1730]: no config at "/usr/lib/ignition/user.ign" Jan 28 01:19:19.285562 ignition[1730]: failed to fetch config: resource requires networking Jan 28 01:19:19.285736 ignition[1730]: Ignition finished successfully Jan 28 01:19:19.313970 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 Jan 28 01:19:19.314111 ignition[1738]: Ignition 2.24.0 Jan 28 01:19:19.314116 ignition[1738]: Stage: fetch Jan 28 01:19:19.314316 ignition[1738]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:19.319847 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Jan 28 01:19:19.320193 kernel: hv_netvsc f8615163-0000-1000-2000-6045bddd17f1 eth0: Data path switched to VF: enP30832s1 Jan 28 01:19:19.314324 ignition[1738]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:19.314390 ignition[1738]: parsed url from cmdline: "" Jan 28 01:19:19.323472 systemd-networkd[1476]: enP30832s1: Link UP Jan 28 01:19:19.314392 ignition[1738]: no config URL provided Jan 28 01:19:19.323600 systemd-networkd[1476]: enP30832s1: Gained carrier Jan 28 01:19:19.314396 ignition[1738]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 01:19:19.314401 ignition[1738]: no config at "/usr/lib/ignition/user.ign" Jan 28 01:19:19.314421 ignition[1738]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Jan 28 01:19:19.546045 ignition[1738]: GET result: OK Jan 28 01:19:19.546117 ignition[1738]: config has been read from IMDS userdata Jan 28 01:19:19.546142 ignition[1738]: parsing config with SHA512: b7df9aa13c3ea0452f3b90447c5a8e3b87f1822cabe54aa8750e64876d3579527d530b0d2e3e09b307685bb63b1daf1f33641c8360e32d2cfca14641246c56df Jan 28 01:19:19.551243 unknown[1738]: fetched base config from "system" Jan 28 01:19:19.551250 unknown[1738]: fetched base config from "system" Jan 28 01:19:19.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.551540 ignition[1738]: fetch: fetch complete Jan 28 01:19:19.551255 unknown[1738]: fetched user config from "azure" Jan 28 01:19:19.551543 ignition[1738]: fetch: fetch passed Jan 28 01:19:19.553480 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 28 01:19:19.551576 ignition[1738]: Ignition finished successfully Jan 28 01:19:19.560051 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 28 01:19:19.581823 ignition[1744]: Ignition 2.24.0 Jan 28 01:19:19.581833 ignition[1744]: Stage: kargs Jan 28 01:19:19.582056 ignition[1744]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:19.584591 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 28 01:19:19.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.582062 ignition[1744]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:19.589320 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
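The fetch stage above pulls the Ignition config from the Azure IMDS userData endpoint and then logs a SHA512 of the parsed config. A rough sketch of that fetch (illustrative only, not Ignition's implementation): IMDS requires the "Metadata: true" request header, and userData is returned base64-encoded.

# Illustrative sketch only: fetch Azure IMDS userData as in the GET line above,
# then hash it the way the "parsing config with SHA512: ..." line reports.
import base64
import hashlib
import urllib.request

IMDS_USERDATA = ("http://169.254.169.254/metadata/instance/compute/userData"
                 "?api-version=2021-01-01&format=text")

def fetch_userdata() -> bytes:
    req = urllib.request.Request(IMDS_USERDATA, headers={"Metadata": "true"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        raw = resp.read()
    return base64.b64decode(raw)  # userData arrives base64-encoded

if __name__ == "__main__":
    config = fetch_userdata()
    print("config SHA512:", hashlib.sha512(config).hexdigest())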
Jan 28 01:19:19.582714 ignition[1744]: kargs: kargs passed Jan 28 01:19:19.582743 ignition[1744]: Ignition finished successfully Jan 28 01:19:19.607654 ignition[1750]: Ignition 2.24.0 Jan 28 01:19:19.607664 ignition[1750]: Stage: disks Jan 28 01:19:19.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.609503 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 28 01:19:19.607893 ignition[1750]: no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:19.614530 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 28 01:19:19.607900 ignition[1750]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:19.616841 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 28 01:19:19.608593 ignition[1750]: disks: disks passed Jan 28 01:19:19.618445 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 28 01:19:19.608616 ignition[1750]: Ignition finished successfully Jan 28 01:19:19.619721 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 01:19:19.623004 systemd[1]: Reached target basic.target - Basic System. Jan 28 01:19:19.627078 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 28 01:19:19.699429 systemd-fsck[1758]: ROOT: clean, 15/6361680 files, 408771/6359552 blocks Jan 28 01:19:19.702935 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 28 01:19:19.703000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:19.707066 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 28 01:19:20.050964 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 60a46795-cc10-4076-a709-d039d1c23a6b r/w with ordered data mode. Quota mode: none. Jan 28 01:19:20.051615 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 28 01:19:20.055411 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 28 01:19:20.087706 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 01:19:20.093032 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 28 01:19:20.097022 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 28 01:19:20.102576 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 28 01:19:20.102611 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 28 01:19:20.106302 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 28 01:19:20.121050 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1767) Jan 28 01:19:20.121074 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:19:20.121090 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:19:20.121088 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 28 01:19:20.130681 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Jan 28 01:19:20.130707 kernel: BTRFS info (device nvme0n1p6): turning on async discard Jan 28 01:19:20.130715 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Jan 28 01:19:20.132182 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 28 01:19:20.717465 coreos-metadata[1769]: Jan 28 01:19:20.717 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Jan 28 01:19:20.728359 coreos-metadata[1769]: Jan 28 01:19:20.728 INFO Fetch successful Jan 28 01:19:20.731011 coreos-metadata[1769]: Jan 28 01:19:20.729 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Jan 28 01:19:20.739671 coreos-metadata[1769]: Jan 28 01:19:20.739 INFO Fetch successful Jan 28 01:19:20.741685 coreos-metadata[1769]: Jan 28 01:19:20.741 INFO wrote hostname ci-4593.0.0-n-84a137a86c to /sysroot/etc/hostname Jan 28 01:19:20.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:20.742443 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 28 01:19:21.915555 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 28 01:19:21.926552 kernel: kauditd_printk_skb: 13 callbacks suppressed Jan 28 01:19:21.926614 kernel: audit: type=1130 audit(1769563161.915:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:21.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:21.918859 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 28 01:19:21.930065 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 28 01:19:21.955294 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 28 01:19:21.959308 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:19:21.975619 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 28 01:19:21.978071 ignition[1870]: INFO : Ignition 2.24.0 Jan 28 01:19:21.978071 ignition[1870]: INFO : Stage: mount Jan 28 01:19:21.978071 ignition[1870]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:21.978071 ignition[1870]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:21.978071 ignition[1870]: INFO : mount: mount passed Jan 28 01:19:21.978071 ignition[1870]: INFO : Ignition finished successfully Jan 28 01:19:22.005671 kernel: audit: type=1130 audit(1769563161.979:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:22.005696 kernel: audit: type=1130 audit(1769563161.986:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:21.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:21.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:21.981286 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 28 01:19:21.990472 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 28 01:19:22.009063 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 01:19:22.035962 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1881) Jan 28 01:19:22.036000 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 886243c7-f2f0-4861-ae6f-419cdf70e432 Jan 28 01:19:22.037964 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Jan 28 01:19:22.042972 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Jan 28 01:19:22.043006 kernel: BTRFS info (device nvme0n1p6): turning on async discard Jan 28 01:19:22.044045 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Jan 28 01:19:22.045733 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 28 01:19:22.066342 ignition[1897]: INFO : Ignition 2.24.0 Jan 28 01:19:22.066342 ignition[1897]: INFO : Stage: files Jan 28 01:19:22.070005 ignition[1897]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:22.070005 ignition[1897]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:22.070005 ignition[1897]: DEBUG : files: compiled without relabeling support, skipping Jan 28 01:19:22.084031 ignition[1897]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 28 01:19:22.084031 ignition[1897]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 28 01:19:22.157757 ignition[1897]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 28 01:19:22.161045 ignition[1897]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 28 01:19:22.161045 ignition[1897]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 28 01:19:22.158647 unknown[1897]: wrote ssh authorized keys file for user: core Jan 28 01:19:22.173896 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 28 01:19:22.178002 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 28 01:19:38.204867 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 28 01:19:38.266738 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: 
createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 01:19:38.272041 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:19:38.291972 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Jan 28 01:19:38.669814 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 28 01:19:40.097880 ignition[1897]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 28 01:19:40.097880 ignition[1897]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 28 01:19:40.148547 ignition[1897]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 01:19:40.153477 ignition[1897]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 01:19:40.153477 ignition[1897]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 28 01:19:40.158484 ignition[1897]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 28 01:19:40.158484 ignition[1897]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 28 01:19:40.158484 ignition[1897]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 28 01:19:40.158484 ignition[1897]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 28 01:19:40.158484 ignition[1897]: INFO : files: files passed Jan 28 01:19:40.158484 ignition[1897]: INFO : Ignition finished successfully Jan 28 01:19:40.157285 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 28 01:19:40.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:40.176963 kernel: audit: type=1130 audit(1769563180.172:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.177075 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 28 01:19:40.181077 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 28 01:19:40.192467 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 28 01:19:40.196000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.192547 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 28 01:19:40.207082 kernel: audit: type=1130 audit(1769563180.196:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.207106 kernel: audit: type=1131 audit(1769563180.196:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.196000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.207156 initrd-setup-root-after-ignition[1929]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:19:40.207156 initrd-setup-root-after-ignition[1929]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:19:40.210720 initrd-setup-root-after-ignition[1933]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 01:19:40.213128 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 01:19:40.212000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.213821 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 28 01:19:40.225054 kernel: audit: type=1130 audit(1769563180.212:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.220510 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 28 01:19:40.251208 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 28 01:19:40.251286 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 28 01:19:40.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.255335 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 28 01:19:40.266524 kernel: audit: type=1130 audit(1769563180.254:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:40.266546 kernel: audit: type=1131 audit(1769563180.254:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.254000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.264346 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 28 01:19:40.268185 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 28 01:19:40.270462 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 28 01:19:40.290467 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 01:19:40.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.296325 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 28 01:19:40.297716 kernel: audit: type=1130 audit(1769563180.292:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.315367 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 01:19:40.315563 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 28 01:19:40.320128 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 01:19:40.324124 systemd[1]: Stopped target timers.target - Timer Units. Jan 28 01:19:40.328100 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 28 01:19:40.331000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.328207 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 01:19:40.335731 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 28 01:19:40.341116 kernel: audit: type=1131 audit(1769563180.331:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.339106 systemd[1]: Stopped target basic.target - Basic System. Jan 28 01:19:40.341184 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 28 01:19:40.341446 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 28 01:19:40.341738 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 28 01:19:40.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.342018 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 28 01:19:40.367739 kernel: audit: type=1131 audit(1769563180.356:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:40.342230 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 28 01:19:40.342535 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 01:19:40.372000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.342829 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 28 01:19:40.378185 kernel: audit: type=1131 audit(1769563180.372:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.342987 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 28 01:19:40.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.351214 systemd[1]: Stopped target swap.target - Swaps. Jan 28 01:19:40.385000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.355091 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 28 01:19:40.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.355221 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 28 01:19:40.358172 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 28 01:19:40.362713 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 01:19:40.397000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.366055 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 28 01:19:40.366205 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 01:19:40.368766 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 28 01:19:40.368865 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 28 01:19:40.378221 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 28 01:19:40.378353 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 01:19:40.382861 systemd[1]: ignition-files.service: Deactivated successfully. Jan 28 01:19:40.382970 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 28 01:19:40.386122 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 28 01:19:40.386249 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. 
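flatcar-metadata-hostname.service, stopped just above, is the unit that earlier fetched the instance name from IMDS and wrote it to /sysroot/etc/hostname (see the coreos-metadata lines at 01:19:20). A simplified sketch of that step, using the endpoint shown in the log (illustrative only, not the coreos-metadata source):

# Simplified sketch of what flatcar-metadata-hostname did earlier in this log:
# fetch the instance name from Azure IMDS and write it as the hostname file.
import urllib.request

IMDS_NAME = ("http://169.254.169.254/metadata/instance/compute/name"
             "?api-version=2017-08-01&format=text")

def write_hostname(path: str = "/sysroot/etc/hostname") -> str:
    req = urllib.request.Request(IMDS_NAME, headers={"Metadata": "true"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        name = resp.read().decode().strip()
    with open(path, "w") as f:
        f.write(name + "\n")
    return name

if __name__ == "__main__":
    print("wrote hostname", write_hostname())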
Jan 28 01:19:40.421861 ignition[1953]: INFO : Ignition 2.24.0 Jan 28 01:19:40.421861 ignition[1953]: INFO : Stage: umount Jan 28 01:19:40.421861 ignition[1953]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 01:19:40.421861 ignition[1953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Jan 28 01:19:40.421861 ignition[1953]: INFO : umount: umount passed Jan 28 01:19:40.421861 ignition[1953]: INFO : Ignition finished successfully Jan 28 01:19:40.390102 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 28 01:19:40.392691 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 28 01:19:40.393762 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 01:19:40.408760 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 28 01:19:40.419689 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 28 01:19:40.434000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.419833 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 01:19:40.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.435164 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 28 01:19:40.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.435294 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 01:19:40.447000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.439122 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 28 01:19:40.453000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.439214 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 01:19:40.447808 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 28 01:19:40.457000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.447897 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 28 01:19:40.450663 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 28 01:19:40.464000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.450812 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 28 01:19:40.467000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:40.454847 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 28 01:19:40.454896 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 28 01:19:40.458455 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 28 01:19:40.459680 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 28 01:19:40.465650 systemd[1]: Stopped target network.target - Network. Jan 28 01:19:40.468303 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 28 01:19:40.468348 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 01:19:40.468519 systemd[1]: Stopped target paths.target - Path Units. Jan 28 01:19:40.503000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.506000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.468538 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 28 01:19:40.468971 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 01:19:40.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.513000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.473197 systemd[1]: Stopped target slices.target - Slice Units. Jan 28 01:19:40.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.485120 systemd[1]: Stopped target sockets.target - Socket Units. Jan 28 01:19:40.489015 systemd[1]: iscsid.socket: Deactivated successfully. Jan 28 01:19:40.524000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.489079 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 01:19:40.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.490365 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 28 01:19:40.490776 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 01:19:40.531000 audit: BPF prog-id=6 op=UNLOAD Jan 28 01:19:40.532000 audit: BPF prog-id=9 op=UNLOAD Jan 28 01:19:40.493575 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 28 01:19:40.493600 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 28 01:19:40.498300 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 28 01:19:40.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:40.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.543000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.543000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.498355 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 28 01:19:40.504122 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 28 01:19:40.504162 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 28 01:19:40.507550 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 28 01:19:40.511087 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 28 01:19:40.513982 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 28 01:19:40.561000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.514518 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 28 01:19:40.570000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.570000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.514594 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 28 01:19:40.516460 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 28 01:19:40.576000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.516516 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 28 01:19:40.521436 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 28 01:19:40.521517 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 28 01:19:40.526312 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 28 01:19:40.526389 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 28 01:19:40.532050 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 28 01:19:40.594000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.534563 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 28 01:19:40.598000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.534608 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Jan 28 01:19:40.602000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.537613 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 28 01:19:40.537667 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 28 01:19:40.543673 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 28 01:19:40.615662 kernel: hv_netvsc f8615163-0000-1000-2000-6045bddd17f1 eth0: Data path switched from VF: enP30832s1 Jan 28 01:19:40.615906 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Jan 28 01:19:40.612000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.543785 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 28 01:19:40.543827 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 01:19:40.544085 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 28 01:19:40.544118 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 28 01:19:40.622000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:40.544321 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 28 01:19:40.544348 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 28 01:19:40.546015 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 01:19:40.556468 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 28 01:19:40.556635 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:19:40.562805 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 28 01:19:40.562839 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 28 01:19:40.566763 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 28 01:19:40.566796 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 01:19:40.570031 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 28 01:19:40.570078 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 28 01:19:40.571064 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 28 01:19:40.571102 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 28 01:19:40.574587 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 28 01:19:40.574668 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 28 01:19:40.582690 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 28 01:19:40.588393 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 28 01:19:40.588455 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 28 01:19:40.595027 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 28 01:19:40.595073 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 01:19:40.599014 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:19:40.599052 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:40.611737 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 28 01:19:40.611815 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 28 01:19:40.620704 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 28 01:19:40.620780 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 28 01:19:40.623531 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 28 01:19:40.628843 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 28 01:19:40.655796 systemd[1]: Switching root. Jan 28 01:19:40.726278 systemd-journald[1106]: Journal stopped Jan 28 01:19:45.324933 systemd-journald[1106]: Received SIGTERM from PID 1 (systemd). Jan 28 01:19:45.324973 kernel: SELinux: policy capability network_peer_controls=1 Jan 28 01:19:45.324986 kernel: SELinux: policy capability open_perms=1 Jan 28 01:19:45.324997 kernel: SELinux: policy capability extended_socket_class=1 Jan 28 01:19:45.325004 kernel: SELinux: policy capability always_check_network=0 Jan 28 01:19:45.325009 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 28 01:19:45.325015 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 28 01:19:45.325022 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 28 01:19:45.325033 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 28 01:19:45.325041 kernel: SELinux: policy capability userspace_initial_context=0 Jan 28 01:19:45.325049 systemd[1]: Successfully loaded SELinux policy in 170.280ms. Jan 28 01:19:45.325056 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.880ms. Jan 28 01:19:45.325066 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 28 01:19:45.325080 systemd[1]: Detected virtualization microsoft. Jan 28 01:19:45.325087 systemd[1]: Detected architecture x86-64. Jan 28 01:19:45.325093 systemd[1]: Detected first boot. Jan 28 01:19:45.325100 systemd[1]: Hostname set to . Jan 28 01:19:45.327009 systemd[1]: Initializing machine ID from random generator. Jan 28 01:19:45.327027 zram_generator::config[1995]: No configuration found. Jan 28 01:19:45.327039 kernel: Guest personality initialized and is inactive Jan 28 01:19:45.327049 kernel: VMCI host device registered (name=vmci, major=10, minor=259) Jan 28 01:19:45.327058 kernel: Initialized host personality Jan 28 01:19:45.327066 kernel: NET: Registered PF_VSOCK protocol family Jan 28 01:19:45.327074 systemd[1]: Populated /etc with preset unit settings. Jan 28 01:19:45.327082 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 28 01:19:45.327089 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 28 01:19:45.327096 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. 
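Because this is a detected first boot, systemd initializes the machine ID from the random generator; the ID is 128 random bits stored as 32 lowercase hex characters in /etc/machine-id. A toy sketch of producing a value in that format (illustrative only; systemd's real code path also handles provisioned IDs and first-boot commit semantics, which this ignores):

# Illustrative only: produce a value in /etc/machine-id format
# (128 random bits rendered as 32 lowercase hex characters).
import secrets

def new_machine_id() -> str:
    return secrets.token_hex(16)

if __name__ == "__main__":
    print(new_machine_id())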
Jan 28 01:19:45.327106 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 28 01:19:45.327113 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 28 01:19:45.327120 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 28 01:19:45.327128 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 28 01:19:45.327136 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 28 01:19:45.327143 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 28 01:19:45.327149 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 28 01:19:45.327155 systemd[1]: Created slice user.slice - User and Session Slice. Jan 28 01:19:45.327162 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 01:19:45.327170 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 01:19:45.327176 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 28 01:19:45.327183 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 28 01:19:45.327189 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 28 01:19:45.327198 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 28 01:19:45.327206 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 28 01:19:45.327215 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 01:19:45.327222 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 28 01:19:45.327230 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 28 01:19:45.327238 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 28 01:19:45.327245 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 28 01:19:45.327253 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 28 01:19:45.327261 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 01:19:45.327270 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 28 01:19:45.327277 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 28 01:19:45.327285 systemd[1]: Reached target slices.target - Slice Units. Jan 28 01:19:45.327293 systemd[1]: Reached target swap.target - Swaps. Jan 28 01:19:45.327301 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 28 01:19:45.327309 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 28 01:19:45.327318 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 28 01:19:45.327326 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 28 01:19:45.327334 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 28 01:19:45.327341 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 28 01:19:45.327350 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 28 01:19:45.327358 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. 
Jan 28 01:19:45.327365 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 28 01:19:45.327373 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 01:19:45.327380 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 28 01:19:45.327388 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 28 01:19:45.327395 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 28 01:19:45.327404 systemd[1]: Mounting media.mount - External Media Directory... Jan 28 01:19:45.327412 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:45.327418 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 28 01:19:45.327424 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 28 01:19:45.327430 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 28 01:19:45.327437 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 28 01:19:45.327443 systemd[1]: Reached target machines.target - Containers. Jan 28 01:19:45.327451 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 28 01:19:45.327458 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:19:45.327464 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 28 01:19:45.327470 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 28 01:19:45.327477 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 01:19:45.327483 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 28 01:19:45.327490 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 01:19:45.327497 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 28 01:19:45.327503 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 28 01:19:45.327509 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 28 01:19:45.327515 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 28 01:19:45.327522 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 28 01:19:45.327527 kernel: kauditd_printk_skb: 52 callbacks suppressed Jan 28 01:19:45.327536 kernel: audit: type=1131 audit(1769563185.220:101): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.327542 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 28 01:19:45.327548 systemd[1]: Stopped systemd-fsck-usr.service. Jan 28 01:19:45.327555 kernel: audit: type=1131 audit(1769563185.230:102): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:45.327561 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:19:45.327569 kernel: audit: type=1334 audit(1769563185.237:103): prog-id=14 op=UNLOAD Jan 28 01:19:45.327575 kernel: audit: type=1334 audit(1769563185.237:104): prog-id=13 op=UNLOAD Jan 28 01:19:45.327581 kernel: audit: type=1334 audit(1769563185.237:105): prog-id=15 op=LOAD Jan 28 01:19:45.327588 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 28 01:19:45.327594 kernel: audit: type=1334 audit(1769563185.238:106): prog-id=16 op=LOAD Jan 28 01:19:45.327600 kernel: audit: type=1334 audit(1769563185.238:107): prog-id=17 op=LOAD Jan 28 01:19:45.327605 kernel: fuse: init (API version 7.41) Jan 28 01:19:45.327613 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 28 01:19:45.327619 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 28 01:19:45.327645 systemd-journald[2078]: Collecting audit messages is enabled. Jan 28 01:19:45.327662 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 28 01:19:45.327668 kernel: audit: type=1305 audit(1769563185.309:108): op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 28 01:19:45.327675 kernel: audit: type=1300 audit(1769563185.309:108): arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffc6e286950 a2=4000 a3=0 items=0 ppid=1 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:45.327681 kernel: audit: type=1327 audit(1769563185.309:108): proctitle="/usr/lib/systemd/systemd-journald" Jan 28 01:19:45.327688 systemd-journald[2078]: Journal started Jan 28 01:19:45.327706 systemd-journald[2078]: Runtime Journal (/run/log/journal/d5ccb7624b52491d8322694391abdad9) is 8M, max 158.5M, 150.5M free. Jan 28 01:19:45.031000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 28 01:19:45.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.230000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.237000 audit: BPF prog-id=14 op=UNLOAD Jan 28 01:19:45.237000 audit: BPF prog-id=13 op=UNLOAD Jan 28 01:19:45.237000 audit: BPF prog-id=15 op=LOAD Jan 28 01:19:45.238000 audit: BPF prog-id=16 op=LOAD Jan 28 01:19:45.238000 audit: BPF prog-id=17 op=LOAD Jan 28 01:19:45.309000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 28 01:19:45.333032 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 28 01:19:45.309000 audit[2078]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffc6e286950 a2=4000 a3=0 items=0 ppid=1 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:45.309000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 28 01:19:44.899875 systemd[1]: Queued start job for default target multi-user.target. Jan 28 01:19:44.910480 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Jan 28 01:19:44.910905 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 28 01:19:45.340185 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 28 01:19:45.346971 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:45.355967 systemd[1]: Started systemd-journald.service - Journal Service. Jan 28 01:19:45.353000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.355432 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 28 01:19:45.358851 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 28 01:19:45.361746 systemd[1]: Mounted media.mount - External Media Directory. Jan 28 01:19:45.366173 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 28 01:19:45.369130 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 28 01:19:45.372082 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 28 01:19:45.373261 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 01:19:45.374000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.375176 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 28 01:19:45.375309 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 28 01:19:45.375000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.376816 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 01:19:45.377088 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 01:19:45.377000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:45.378526 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 28 01:19:45.378727 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 28 01:19:45.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.382193 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 28 01:19:45.382318 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 28 01:19:45.383833 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 01:19:45.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.384229 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 01:19:45.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.386000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.388000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.387357 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 28 01:19:45.389831 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 01:19:45.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.391810 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 28 01:19:45.394000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.399774 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 28 01:19:45.401853 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 28 01:19:45.408039 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 28 01:19:45.410887 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Jan 28 01:19:45.413025 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 28 01:19:45.413055 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 28 01:19:45.415351 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 28 01:19:45.425726 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:19:45.425833 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:19:45.450967 kernel: ACPI: bus type drm_connector registered Jan 28 01:19:45.455076 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 28 01:19:45.458132 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 28 01:19:45.460414 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 28 01:19:45.467046 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 28 01:19:45.468973 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 28 01:19:45.470100 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 28 01:19:45.475120 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 28 01:19:45.479781 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 28 01:19:45.484329 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 28 01:19:45.487000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.490000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.489053 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 28 01:19:45.492692 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 28 01:19:45.498212 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 28 01:19:45.506371 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 01:19:45.507000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.508305 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 28 01:19:45.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:45.511338 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 28 01:19:45.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.514524 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 28 01:19:45.517501 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 28 01:19:45.520201 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 28 01:19:45.527465 systemd-journald[2078]: Time spent on flushing to /var/log/journal/d5ccb7624b52491d8322694391abdad9 is 10.066ms for 1137 entries. Jan 28 01:19:45.527465 systemd-journald[2078]: System Journal (/var/log/journal/d5ccb7624b52491d8322694391abdad9) is 8M, max 2.2G, 2.2G free. Jan 28 01:19:45.563381 systemd-journald[2078]: Received client request to flush runtime journal. Jan 28 01:19:45.563420 kernel: loop1: detected capacity change from 0 to 50784 Jan 28 01:19:45.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.554121 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 28 01:19:45.564457 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 28 01:19:45.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.584689 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 28 01:19:45.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.697217 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 28 01:19:45.702861 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 28 01:19:45.700000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.701000 audit: BPF prog-id=18 op=LOAD Jan 28 01:19:45.701000 audit: BPF prog-id=19 op=LOAD Jan 28 01:19:45.701000 audit: BPF prog-id=20 op=LOAD Jan 28 01:19:45.704000 audit: BPF prog-id=21 op=LOAD Jan 28 01:19:45.707125 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 28 01:19:45.710055 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 28 01:19:45.713000 audit: BPF prog-id=22 op=LOAD Jan 28 01:19:45.715000 audit: BPF prog-id=23 op=LOAD Jan 28 01:19:45.715000 audit: BPF prog-id=24 op=LOAD Jan 28 01:19:45.717107 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... 
Jan 28 01:19:45.718000 audit: BPF prog-id=25 op=LOAD Jan 28 01:19:45.719000 audit: BPF prog-id=26 op=LOAD Jan 28 01:19:45.719000 audit: BPF prog-id=27 op=LOAD Jan 28 01:19:45.721150 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 28 01:19:45.764333 systemd-nsresourced[2156]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 28 01:19:45.765418 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 28 01:19:45.766000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.785478 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 28 01:19:45.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.808529 systemd-tmpfiles[2155]: ACLs are not supported, ignoring. Jan 28 01:19:45.808752 systemd-tmpfiles[2155]: ACLs are not supported, ignoring. Jan 28 01:19:45.814651 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 01:19:45.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.877206 systemd-oomd[2153]: No swap; memory pressure usage will be degraded Jan 28 01:19:45.878055 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 28 01:19:45.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:45.891975 kernel: loop2: detected capacity change from 0 to 111560 Jan 28 01:19:45.899455 systemd-resolved[2154]: Positive Trust Anchors: Jan 28 01:19:45.899467 systemd-resolved[2154]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 28 01:19:45.899471 systemd-resolved[2154]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 28 01:19:45.899502 systemd-resolved[2154]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 28 01:19:45.912020 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 28 01:19:46.084230 systemd-resolved[2154]: Using system hostname 'ci-4593.0.0-n-84a137a86c'. Jan 28 01:19:46.086129 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 28 01:19:46.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:46.088278 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 28 01:19:46.109969 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 28 01:19:46.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.110000 audit: BPF prog-id=8 op=UNLOAD Jan 28 01:19:46.110000 audit: BPF prog-id=7 op=UNLOAD Jan 28 01:19:46.111000 audit: BPF prog-id=28 op=LOAD Jan 28 01:19:46.111000 audit: BPF prog-id=29 op=LOAD Jan 28 01:19:46.112518 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 01:19:46.148804 systemd-udevd[2176]: Using default interface naming scheme 'v257'. Jan 28 01:19:46.246974 kernel: loop3: detected capacity change from 0 to 229808 Jan 28 01:19:46.296971 kernel: loop4: detected capacity change from 0 to 25512 Jan 28 01:19:46.301352 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 01:19:46.303000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.304000 audit: BPF prog-id=30 op=LOAD Jan 28 01:19:46.308479 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 28 01:19:46.363473 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 28 01:19:46.403838 systemd-networkd[2191]: lo: Link UP Jan 28 01:19:46.403844 systemd-networkd[2191]: lo: Gained carrier Jan 28 01:19:46.405723 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 01:19:46.407804 systemd-networkd[2191]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:46.407810 systemd-networkd[2191]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 28 01:19:46.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.409102 systemd[1]: Reached target network.target - Network. Jan 28 01:19:46.411973 kernel: hv_vmbus: registering driver hv_balloon Jan 28 01:19:46.412033 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 Jan 28 01:19:46.415970 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Jan 28 01:19:46.416073 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 28 01:19:46.420161 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... 
Jan 28 01:19:46.427166 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Jan 28 01:19:46.431642 kernel: hv_netvsc f8615163-0000-1000-2000-6045bddd17f1 eth0: Data path switched to VF: enP30832s1 Jan 28 01:19:46.435366 systemd-networkd[2191]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:46.435420 systemd-networkd[2191]: enP30832s1: Link UP Jan 28 01:19:46.435780 kernel: hv_vmbus: registering driver hyperv_fb Jan 28 01:19:46.435804 kernel: mousedev: PS/2 mouse device common for all mice Jan 28 01:19:46.435499 systemd-networkd[2191]: eth0: Link UP Jan 28 01:19:46.435502 systemd-networkd[2191]: eth0: Gained carrier Jan 28 01:19:46.435511 systemd-networkd[2191]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:46.440314 systemd-networkd[2191]: enP30832s1: Gained carrier Jan 28 01:19:46.441485 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Jan 28 01:19:46.441522 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Jan 28 01:19:46.444170 kernel: Console: switching to colour dummy device 80x25 Jan 28 01:19:46.449384 kernel: Console: switching to colour frame buffer device 128x48 Jan 28 01:19:46.454006 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#80 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Jan 28 01:19:46.454287 systemd-networkd[2191]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16 Jan 28 01:19:46.477925 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 28 01:19:46.478000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-persistent-storage comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.525206 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:46.535382 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:19:46.535850 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:46.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.537000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.543183 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:46.603758 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 01:19:46.604045 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:46.605000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:46.605000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 28 01:19:46.608103 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 01:19:46.666988 kernel: kvm_intel: Using Hyper-V Enlightened VMCS Jan 28 01:19:46.692305 kernel: loop5: detected capacity change from 0 to 50784 Jan 28 01:19:46.707034 kernel: loop6: detected capacity change from 0 to 111560 Jan 28 01:19:46.723001 kernel: loop7: detected capacity change from 0 to 229808 Jan 28 01:19:46.735384 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. Jan 28 01:19:46.739312 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 28 01:19:46.744977 kernel: loop1: detected capacity change from 0 to 25512 Jan 28 01:19:46.762037 (sd-merge)[2257]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-azure.raw'. Jan 28 01:19:46.764286 (sd-merge)[2257]: Merged extensions into '/usr'. Jan 28 01:19:46.766803 systemd[1]: Reload requested from client PID 2134 ('systemd-sysext') (unit systemd-sysext.service)... Jan 28 01:19:46.766815 systemd[1]: Reloading... Jan 28 01:19:46.816971 zram_generator::config[2300]: No configuration found. Jan 28 01:19:47.001671 systemd[1]: Reloading finished in 234 ms. Jan 28 01:19:47.033329 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 28 01:19:47.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.035254 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 01:19:47.037000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.039042 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 28 01:19:47.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.045995 systemd[1]: Starting ensure-sysext.service... Jan 28 01:19:47.049073 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 28 01:19:47.050000 audit: BPF prog-id=31 op=LOAD Jan 28 01:19:47.050000 audit: BPF prog-id=21 op=UNLOAD Jan 28 01:19:47.051000 audit: BPF prog-id=32 op=LOAD Jan 28 01:19:47.057000 audit: BPF prog-id=33 op=LOAD Jan 28 01:19:47.057000 audit: BPF prog-id=28 op=UNLOAD Jan 28 01:19:47.057000 audit: BPF prog-id=29 op=UNLOAD Jan 28 01:19:47.057000 audit: BPF prog-id=34 op=LOAD Jan 28 01:19:47.057000 audit: BPF prog-id=18 op=UNLOAD Jan 28 01:19:47.057000 audit: BPF prog-id=35 op=LOAD Jan 28 01:19:47.057000 audit: BPF prog-id=36 op=LOAD Jan 28 01:19:47.057000 audit: BPF prog-id=19 op=UNLOAD Jan 28 01:19:47.057000 audit: BPF prog-id=20 op=UNLOAD Jan 28 01:19:47.058000 audit: BPF prog-id=37 op=LOAD Jan 28 01:19:47.058000 audit: BPF prog-id=30 op=UNLOAD Jan 28 01:19:47.058000 audit: BPF prog-id=38 op=LOAD Jan 28 01:19:47.058000 audit: BPF prog-id=22 op=UNLOAD Jan 28 01:19:47.059000 audit: BPF prog-id=39 op=LOAD Jan 28 01:19:47.059000 audit: BPF prog-id=40 op=LOAD Jan 28 01:19:47.059000 audit: BPF prog-id=23 op=UNLOAD Jan 28 01:19:47.059000 audit: BPF prog-id=24 op=UNLOAD Jan 28 01:19:47.059000 audit: BPF prog-id=41 op=LOAD Jan 28 01:19:47.059000 audit: BPF prog-id=25 op=UNLOAD Jan 28 01:19:47.059000 audit: BPF prog-id=42 op=LOAD Jan 28 01:19:47.059000 audit: BPF prog-id=43 op=LOAD Jan 28 01:19:47.059000 audit: BPF prog-id=26 op=UNLOAD Jan 28 01:19:47.059000 audit: BPF prog-id=27 op=UNLOAD Jan 28 01:19:47.060000 audit: BPF prog-id=44 op=LOAD Jan 28 01:19:47.060000 audit: BPF prog-id=15 op=UNLOAD Jan 28 01:19:47.060000 audit: BPF prog-id=45 op=LOAD Jan 28 01:19:47.060000 audit: BPF prog-id=46 op=LOAD Jan 28 01:19:47.060000 audit: BPF prog-id=16 op=UNLOAD Jan 28 01:19:47.060000 audit: BPF prog-id=17 op=UNLOAD Jan 28 01:19:47.066991 systemd[1]: Reload requested from client PID 2361 ('systemctl') (unit ensure-sysext.service)... Jan 28 01:19:47.067006 systemd[1]: Reloading... Jan 28 01:19:47.074103 systemd-tmpfiles[2362]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 28 01:19:47.074336 systemd-tmpfiles[2362]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 28 01:19:47.074566 systemd-tmpfiles[2362]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 28 01:19:47.075546 systemd-tmpfiles[2362]: ACLs are not supported, ignoring. Jan 28 01:19:47.075647 systemd-tmpfiles[2362]: ACLs are not supported, ignoring. Jan 28 01:19:47.125971 zram_generator::config[2396]: No configuration found. Jan 28 01:19:47.162611 systemd-tmpfiles[2362]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 01:19:47.162621 systemd-tmpfiles[2362]: Skipping /boot Jan 28 01:19:47.169059 systemd-tmpfiles[2362]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 01:19:47.169069 systemd-tmpfiles[2362]: Skipping /boot Jan 28 01:19:47.293774 systemd[1]: Reloading finished in 226 ms. 
Jan 28 01:19:47.316000 audit: BPF prog-id=47 op=LOAD Jan 28 01:19:47.316000 audit: BPF prog-id=48 op=LOAD Jan 28 01:19:47.316000 audit: BPF prog-id=32 op=UNLOAD Jan 28 01:19:47.316000 audit: BPF prog-id=33 op=UNLOAD Jan 28 01:19:47.317000 audit: BPF prog-id=49 op=LOAD Jan 28 01:19:47.317000 audit: BPF prog-id=31 op=UNLOAD Jan 28 01:19:47.317000 audit: BPF prog-id=50 op=LOAD Jan 28 01:19:47.317000 audit: BPF prog-id=44 op=UNLOAD Jan 28 01:19:47.317000 audit: BPF prog-id=51 op=LOAD Jan 28 01:19:47.317000 audit: BPF prog-id=52 op=LOAD Jan 28 01:19:47.317000 audit: BPF prog-id=45 op=UNLOAD Jan 28 01:19:47.318000 audit: BPF prog-id=46 op=UNLOAD Jan 28 01:19:47.318000 audit: BPF prog-id=53 op=LOAD Jan 28 01:19:47.318000 audit: BPF prog-id=37 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=54 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=41 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=55 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=56 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=42 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=43 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=57 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=34 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=58 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=59 op=LOAD Jan 28 01:19:47.319000 audit: BPF prog-id=35 op=UNLOAD Jan 28 01:19:47.319000 audit: BPF prog-id=36 op=UNLOAD Jan 28 01:19:47.325000 audit: BPF prog-id=60 op=LOAD Jan 28 01:19:47.325000 audit: BPF prog-id=38 op=UNLOAD Jan 28 01:19:47.325000 audit: BPF prog-id=61 op=LOAD Jan 28 01:19:47.325000 audit: BPF prog-id=62 op=LOAD Jan 28 01:19:47.325000 audit: BPF prog-id=39 op=UNLOAD Jan 28 01:19:47.325000 audit: BPF prog-id=40 op=UNLOAD Jan 28 01:19:47.328293 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 01:19:47.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.336828 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 28 01:19:47.354242 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 28 01:19:47.357819 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 28 01:19:47.364419 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 28 01:19:47.367026 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 28 01:19:47.372459 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.372611 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:19:47.377153 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 01:19:47.377000 audit[2460]: SYSTEM_BOOT pid=2460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.381134 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 01:19:47.385142 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Jan 28 01:19:47.386426 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:19:47.386605 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:19:47.386701 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:19:47.386792 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.392309 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 01:19:47.392468 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 01:19:47.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.394374 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 01:19:47.394775 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 01:19:47.396000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.402667 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.403161 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:19:47.405148 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 01:19:47.409132 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 28 01:19:47.410940 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:19:47.411106 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:19:47.411182 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:19:47.411254 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.412144 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
Jan 28 01:19:47.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.414625 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 28 01:19:47.414773 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 28 01:19:47.415000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.415000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.416000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.417135 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 01:19:47.417288 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 01:19:47.423925 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.424675 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 01:19:47.425606 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 01:19:47.431103 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 28 01:19:47.436229 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 01:19:47.438119 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 01:19:47.438276 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 01:19:47.438375 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 01:19:47.438513 systemd[1]: Reached target time-set.target - System Time Set. Jan 28 01:19:47.444186 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 28 01:19:47.445205 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 01:19:47.449086 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 01:19:47.449000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:47.449000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.450880 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 01:19:47.451096 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 01:19:47.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.451000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.452876 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 28 01:19:47.453114 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 28 01:19:47.456000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.456000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.457420 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 28 01:19:47.457641 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 28 01:19:47.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.458000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.462364 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 28 01:19:47.462531 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 28 01:19:47.463590 systemd[1]: Finished ensure-sysext.service. Jan 28 01:19:47.465000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:19:47.490294 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 28 01:19:47.490000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:19:47.661000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 28 01:19:47.661000 audit[2500]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd1805ff60 a2=420 a3=0 items=0 ppid=2456 pid=2500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:19:47.661000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:19:47.663079 augenrules[2500]: No rules Jan 28 01:19:47.663374 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 01:19:47.663589 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 28 01:19:48.239093 systemd-networkd[2191]: eth0: Gained IPv6LL Jan 28 01:19:48.241231 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 28 01:19:48.243142 systemd[1]: Reached target network-online.target - Network is Online. Jan 28 01:19:48.570966 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 28 01:19:48.572632 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 28 01:19:52.613289 ldconfig[2458]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 28 01:19:52.630806 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 28 01:19:52.633671 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 28 01:19:52.649266 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 28 01:19:52.650728 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 01:19:52.653110 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 28 01:19:52.656055 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 28 01:19:52.659017 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 28 01:19:52.662157 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 28 01:19:52.665071 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 28 01:19:52.668025 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 28 01:19:52.671045 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 28 01:19:52.674012 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 28 01:19:52.675299 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 28 01:19:52.675333 systemd[1]: Reached target paths.target - Path Units. Jan 28 01:19:52.676152 systemd[1]: Reached target timers.target - Timer Units. Jan 28 01:19:52.677928 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 28 01:19:52.681837 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 28 01:19:52.686385 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Jan 28 01:19:52.689105 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 28 01:19:52.692013 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 28 01:19:52.694666 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 28 01:19:52.697224 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 28 01:19:52.700482 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 28 01:19:52.703741 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 01:19:52.704931 systemd[1]: Reached target basic.target - Basic System. Jan 28 01:19:52.705915 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 28 01:19:52.705939 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 28 01:19:52.720177 systemd[1]: Starting chronyd.service - NTP client/server... Jan 28 01:19:52.721847 systemd[1]: Starting containerd.service - containerd container runtime... Jan 28 01:19:52.725146 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 28 01:19:52.728144 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 28 01:19:52.739166 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 28 01:19:52.745025 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 28 01:19:52.748056 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 28 01:19:52.750774 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 28 01:19:52.752067 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 28 01:19:52.754117 systemd[1]: hv_fcopy_uio_daemon.service - Hyper-V FCOPY UIO daemon was skipped because of an unmet condition check (ConditionPathExists=/sys/bus/vmbus/devices/eb765408-105f-49b6-b4aa-c123b64d17d4/uio). Jan 28 01:19:52.755168 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Jan 28 01:19:52.756848 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Jan 28 01:19:52.757730 jq[2518]: false Jan 28 01:19:52.758349 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:19:52.762126 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 28 01:19:52.767134 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 28 01:19:52.775422 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 28 01:19:52.780216 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 28 01:19:52.793352 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 28 01:19:52.795301 KVP[2524]: KVP starting; pid is:2524 Jan 28 01:19:52.800396 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 28 01:19:52.802129 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 28 01:19:52.802507 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
Jan 28 01:19:52.804292 kernel: hv_utils: KVP IC version 4.0 Jan 28 01:19:52.805000 KVP[2524]: KVP LIC Version: 3.1 Jan 28 01:19:52.805605 systemd[1]: Starting update-engine.service - Update Engine... Jan 28 01:19:52.812318 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 28 01:19:52.820346 extend-filesystems[2522]: Found /dev/nvme0n1p6 Jan 28 01:19:52.824276 extend-filesystems[2522]: Found /dev/nvme0n1p9 Jan 28 01:19:52.825802 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 28 01:19:52.826035 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 28 01:19:52.826437 extend-filesystems[2522]: Checking size of /dev/nvme0n1p9 Jan 28 01:19:52.835346 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 28 01:19:52.835560 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 28 01:19:52.843267 systemd[1]: motdgen.service: Deactivated successfully. Jan 28 01:19:52.843710 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Refreshing passwd entry cache Jan 28 01:19:52.843720 oslogin_cache_refresh[2523]: Refreshing passwd entry cache Jan 28 01:19:52.844867 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 28 01:19:52.847804 jq[2539]: true Jan 28 01:19:52.849274 chronyd[2513]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 28 01:19:52.867999 chronyd[2513]: Timezone right/UTC failed leap second check, ignoring Jan 28 01:19:52.868140 chronyd[2513]: Loaded seccomp filter (level 2) Jan 28 01:19:52.868425 systemd[1]: Started chronyd.service - NTP client/server. Jan 28 01:19:52.869266 update_engine[2537]: I20260128 01:19:52.869203 2537 main.cc:92] Flatcar Update Engine starting Jan 28 01:19:52.870400 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Failure getting users, quitting Jan 28 01:19:52.870451 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 28 01:19:52.870451 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Refreshing group entry cache Jan 28 01:19:52.870398 oslogin_cache_refresh[2523]: Failure getting users, quitting Jan 28 01:19:52.870413 oslogin_cache_refresh[2523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 28 01:19:52.870449 oslogin_cache_refresh[2523]: Refreshing group entry cache Jan 28 01:19:52.876333 extend-filesystems[2522]: Resized partition /dev/nvme0n1p9 Jan 28 01:19:52.881253 extend-filesystems[2573]: resize2fs 1.47.3 (8-Jul-2025) Jan 28 01:19:52.884969 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 6359552 to 6376955 blocks Jan 28 01:19:52.886967 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 6376955 Jan 28 01:19:52.888097 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 28 01:19:52.910701 jq[2553]: true Jan 28 01:19:52.910788 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Failure getting groups, quitting Jan 28 01:19:52.910788 google_oslogin_nss_cache[2523]: oslogin_cache_refresh[2523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 28 01:19:52.893045 oslogin_cache_refresh[2523]: Failure getting groups, quitting Jan 28 01:19:52.910793 systemd[1]: google-oslogin-cache.service: Deactivated successfully. 
Jan 28 01:19:52.893053 oslogin_cache_refresh[2523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 28 01:19:52.913097 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 28 01:19:52.935904 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 28 01:19:52.942105 extend-filesystems[2573]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Jan 28 01:19:52.942105 extend-filesystems[2573]: old_desc_blocks = 4, new_desc_blocks = 4 Jan 28 01:19:52.942105 extend-filesystems[2573]: The filesystem on /dev/nvme0n1p9 is now 6376955 (4k) blocks long. Jan 28 01:19:52.954043 extend-filesystems[2522]: Resized filesystem in /dev/nvme0n1p9 Jan 28 01:19:52.947370 systemd-logind[2536]: New seat seat0. Jan 28 01:19:52.956694 tar[2549]: linux-amd64/LICENSE Jan 28 01:19:52.956694 tar[2549]: linux-amd64/helm Jan 28 01:19:52.948891 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 28 01:19:52.950152 systemd-logind[2536]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Jan 28 01:19:52.951343 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 28 01:19:52.958662 systemd[1]: Started systemd-logind.service - User Login Management. Jan 28 01:19:53.015371 bash[2601]: Updated "/home/core/.ssh/authorized_keys" Jan 28 01:19:53.017379 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 28 01:19:53.021391 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Jan 28 01:19:53.108716 dbus-daemon[2516]: [system] SELinux support is enabled Jan 28 01:19:53.108925 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 28 01:19:53.116452 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 28 01:19:53.117382 dbus-daemon[2516]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 28 01:19:53.116479 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 28 01:19:53.119490 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 28 01:19:53.119807 update_engine[2537]: I20260128 01:19:53.119553 2537 update_check_scheduler.cc:74] Next update check in 9m56s Jan 28 01:19:53.119510 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 28 01:19:53.123274 systemd[1]: Started update-engine.service - Update Engine. Jan 28 01:19:53.133193 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 28 01:19:53.196975 coreos-metadata[2515]: Jan 28 01:19:53.196 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetch successful Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetch successful Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetching http://168.63.129.16/machine/aff5e774-af79-4e14-88be-363a61de164c/5f44e9d9%2D62e2%2D4898%2D9cdf%2D52e2ad867c68.%5Fci%2D4593.0.0%2Dn%2D84a137a86c?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetch successful Jan 28 01:19:53.220998 coreos-metadata[2515]: Jan 28 01:19:53.220 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Jan 28 01:19:53.221783 coreos-metadata[2515]: Jan 28 01:19:53.221 INFO Fetch successful Jan 28 01:19:53.289979 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 28 01:19:53.293348 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 28 01:19:53.348001 sshd_keygen[2565]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 28 01:19:53.376345 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 28 01:19:53.387040 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 28 01:19:53.393150 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Jan 28 01:19:53.425406 systemd[1]: issuegen.service: Deactivated successfully. Jan 28 01:19:53.425822 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 28 01:19:53.434160 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 28 01:19:53.456007 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Jan 28 01:19:53.469703 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 28 01:19:53.475220 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 28 01:19:53.480069 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 28 01:19:53.482353 systemd[1]: Reached target getty.target - Login Prompts. Jan 28 01:19:53.488669 locksmithd[2630]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 28 01:19:53.564023 tar[2549]: linux-amd64/README.md Jan 28 01:19:53.575520 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Jan 28 01:19:53.970969 containerd[2557]: time="2026-01-28T01:19:53Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 28 01:19:53.971720 containerd[2557]: time="2026-01-28T01:19:53.971681446Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 28 01:19:53.980267 containerd[2557]: time="2026-01-28T01:19:53.980232381Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.37µs" Jan 28 01:19:53.980267 containerd[2557]: time="2026-01-28T01:19:53.980259224Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 28 01:19:53.980366 containerd[2557]: time="2026-01-28T01:19:53.980290254Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 28 01:19:53.980366 containerd[2557]: time="2026-01-28T01:19:53.980300160Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980408635Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980421313Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980462368Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980472343Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980645095Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980654709Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980669026Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980676517Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980781571Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980793688Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 28 01:19:53.980942 containerd[2557]: time="2026-01-28T01:19:53.980847605Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 
28 01:19:53.981174 containerd[2557]: time="2026-01-28T01:19:53.980999848Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.981174 containerd[2557]: time="2026-01-28T01:19:53.981022072Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 01:19:53.981174 containerd[2557]: time="2026-01-28T01:19:53.981030484Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 28 01:19:53.981174 containerd[2557]: time="2026-01-28T01:19:53.981059442Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 28 01:19:53.981362 containerd[2557]: time="2026-01-28T01:19:53.981316333Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 28 01:19:53.981412 containerd[2557]: time="2026-01-28T01:19:53.981384925Z" level=info msg="metadata content store policy set" policy=shared Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992591041Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992645042Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992791131Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992804759Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992816694Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992827625Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992837737Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992846740Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992856992Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992867092Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992875766Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992886036Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992895257Z" level=info msg="loading plugin" 
id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 28 01:19:53.993496 containerd[2557]: time="2026-01-28T01:19:53.992906232Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993005532Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993021028Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993033295Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993043143Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993052049Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993060344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993070561Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993083970Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993096284Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993106054Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993114942Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993133168Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993171490Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993182574Z" level=info msg="Start snapshots syncer" Jan 28 01:19:53.993795 containerd[2557]: time="2026-01-28T01:19:53.993206317Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 28 01:19:53.994062 containerd[2557]: time="2026-01-28T01:19:53.993519426Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 28 01:19:53.994062 containerd[2557]: time="2026-01-28T01:19:53.993562656Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993609460Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993715104Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993731362Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993740189Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993748994Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993759132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993768793Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993779246Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993787932Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 28 
01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993797032Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993821177Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993832293Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 01:19:53.994188 containerd[2557]: time="2026-01-28T01:19:53.993840157Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993848331Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993855102Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993864005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993912949Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993928269Z" level=info msg="runtime interface created" Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993933949Z" level=info msg="created NRI interface" Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993944599Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993968546Z" level=info msg="Connect containerd service" Jan 28 01:19:53.994397 containerd[2557]: time="2026-01-28T01:19:53.993986849Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 28 01:19:53.995178 containerd[2557]: time="2026-01-28T01:19:53.995144438Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 28 01:19:54.043137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:19:54.054266 (kubelet)[2686]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:19:54.418117 containerd[2557]: time="2026-01-28T01:19:54.417988639Z" level=info msg="Start subscribing containerd event" Jan 28 01:19:54.419121 containerd[2557]: time="2026-01-28T01:19:54.418199117Z" level=info msg="Start recovering state" Jan 28 01:19:54.419289 containerd[2557]: time="2026-01-28T01:19:54.419081250Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 28 01:19:54.419354 containerd[2557]: time="2026-01-28T01:19:54.419342370Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Jan 28 01:19:54.419476 containerd[2557]: time="2026-01-28T01:19:54.419465082Z" level=info msg="Start event monitor" Jan 28 01:19:54.419655 containerd[2557]: time="2026-01-28T01:19:54.419647003Z" level=info msg="Start cni network conf syncer for default" Jan 28 01:19:54.419690 containerd[2557]: time="2026-01-28T01:19:54.419684473Z" level=info msg="Start streaming server" Jan 28 01:19:54.419723 containerd[2557]: time="2026-01-28T01:19:54.419716666Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 28 01:19:54.419761 containerd[2557]: time="2026-01-28T01:19:54.419755254Z" level=info msg="runtime interface starting up..." Jan 28 01:19:54.419791 containerd[2557]: time="2026-01-28T01:19:54.419782858Z" level=info msg="starting plugins..." Jan 28 01:19:54.419831 containerd[2557]: time="2026-01-28T01:19:54.419825297Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 28 01:19:54.421231 containerd[2557]: time="2026-01-28T01:19:54.421055194Z" level=info msg="containerd successfully booted in 0.451198s" Jan 28 01:19:54.421237 systemd[1]: Started containerd.service - containerd container runtime. Jan 28 01:19:54.423539 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 28 01:19:54.427542 systemd[1]: Startup finished in 4.123s (kernel) + 27.161s (initrd) + 12.430s (userspace) = 43.716s. Jan 28 01:19:54.595339 kubelet[2686]: E0128 01:19:54.595297 2686 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:19:54.597151 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:19:54.597273 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:19:54.597643 systemd[1]: kubelet.service: Consumed 826ms CPU time, 265.9M memory peak. 
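The kubelet failure above is the expected pre-provisioning state on a kubeadm-style node: /var/lib/kubelet/config.yaml is only written once kubeadm init/join runs, so systemd keeps restarting the unit (restart counters 1 and 2 appear further down). A minimal sketch of the failing precondition, assuming nothing beyond the path quoted in the error:

from pathlib import Path

cfg = Path("/var/lib/kubelet/config.yaml")  # path quoted in the kubelet error above
if not cfg.is_file():
    # Mirrors the failure in the log: the file simply does not exist yet.
    raise SystemExit(f"failed to load Kubelet config file {cfg}: no such file or directory")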
Jan 28 01:19:54.681102 waagent[2663]: 2026-01-28T01:19:54.680982Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Jan 28 01:19:54.682594 waagent[2663]: 2026-01-28T01:19:54.682206Z INFO Daemon Daemon OS: flatcar 4593.0.0 Jan 28 01:19:54.683562 waagent[2663]: 2026-01-28T01:19:54.682829Z INFO Daemon Daemon Python: 3.12.11 Jan 28 01:19:54.684537 waagent[2663]: 2026-01-28T01:19:54.684235Z INFO Daemon Daemon Run daemon Jan 28 01:19:54.685757 waagent[2663]: 2026-01-28T01:19:54.685526Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4593.0.0' Jan 28 01:19:54.687449 waagent[2663]: 2026-01-28T01:19:54.687411Z INFO Daemon Daemon Using waagent for provisioning Jan 28 01:19:54.688796 waagent[2663]: 2026-01-28T01:19:54.688763Z INFO Daemon Daemon Activate resource disk Jan 28 01:19:54.689885 waagent[2663]: 2026-01-28T01:19:54.689822Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Jan 28 01:19:54.692673 waagent[2663]: 2026-01-28T01:19:54.692635Z INFO Daemon Daemon Found device: None Jan 28 01:19:54.693206 waagent[2663]: 2026-01-28T01:19:54.693174Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Jan 28 01:19:54.693291 waagent[2663]: 2026-01-28T01:19:54.693265Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Jan 28 01:19:54.697402 waagent[2663]: 2026-01-28T01:19:54.697359Z INFO Daemon Daemon Clean protocol and wireserver endpoint Jan 28 01:19:54.698547 waagent[2663]: 2026-01-28T01:19:54.698517Z INFO Daemon Daemon Running default provisioning handler Jan 28 01:19:54.704795 waagent[2663]: 2026-01-28T01:19:54.704671Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Jan 28 01:19:54.708006 waagent[2663]: 2026-01-28T01:19:54.707719Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Jan 28 01:19:54.708978 waagent[2663]: 2026-01-28T01:19:54.708286Z INFO Daemon Daemon cloud-init is enabled: False Jan 28 01:19:54.708978 waagent[2663]: 2026-01-28T01:19:54.708715Z INFO Daemon Daemon Copying ovf-env.xml Jan 28 01:19:54.807153 waagent[2663]: 2026-01-28T01:19:54.805670Z INFO Daemon Daemon Successfully mounted dvd Jan 28 01:19:54.831350 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Jan 28 01:19:54.832754 waagent[2663]: 2026-01-28T01:19:54.832708Z INFO Daemon Daemon Detect protocol endpoint Jan 28 01:19:54.835520 waagent[2663]: 2026-01-28T01:19:54.833189Z INFO Daemon Daemon Clean protocol and wireserver endpoint Jan 28 01:19:54.835520 waagent[2663]: 2026-01-28T01:19:54.833322Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Jan 28 01:19:54.835520 waagent[2663]: 2026-01-28T01:19:54.833583Z INFO Daemon Daemon Test for route to 168.63.129.16 Jan 28 01:19:54.835520 waagent[2663]: 2026-01-28T01:19:54.833738Z INFO Daemon Daemon Route to 168.63.129.16 exists Jan 28 01:19:54.835520 waagent[2663]: 2026-01-28T01:19:54.833843Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Jan 28 01:19:54.848399 waagent[2663]: 2026-01-28T01:19:54.848369Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Jan 28 01:19:54.849972 waagent[2663]: 2026-01-28T01:19:54.849007Z INFO Daemon Daemon Wire protocol version:2012-11-30 Jan 28 01:19:54.849972 waagent[2663]: 2026-01-28T01:19:54.849174Z INFO Daemon Daemon Server preferred version:2015-04-05 Jan 28 01:19:54.925323 waagent[2663]: 2026-01-28T01:19:54.925270Z INFO Daemon Daemon Initializing goal state during protocol detection Jan 28 01:19:54.926582 waagent[2663]: 2026-01-28T01:19:54.926452Z INFO Daemon Daemon Forcing an update of the goal state. Jan 28 01:19:54.935666 waagent[2663]: 2026-01-28T01:19:54.935472Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Jan 28 01:19:54.947083 waagent[2663]: 2026-01-28T01:19:54.947050Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.177 Jan 28 01:19:54.948266 waagent[2663]: 2026-01-28T01:19:54.948230Z INFO Daemon Jan 28 01:19:54.948621 waagent[2663]: 2026-01-28T01:19:54.948594Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 4c3135ca-6b80-48e7-8bc9-67604e0dadac eTag: 5636434372206832783 source: Fabric] Jan 28 01:19:54.950767 waagent[2663]: 2026-01-28T01:19:54.950735Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Jan 28 01:19:54.952049 waagent[2663]: 2026-01-28T01:19:54.952020Z INFO Daemon Jan 28 01:19:54.952509 waagent[2663]: 2026-01-28T01:19:54.952363Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Jan 28 01:19:54.956227 waagent[2663]: 2026-01-28T01:19:54.956203Z INFO Daemon Daemon Downloading artifacts profile blob Jan 28 01:19:55.005666 login[2671]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:55.005666 login[2672]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:55.011753 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 28 01:19:55.012637 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 28 01:19:55.017640 systemd-logind[2536]: New session 2 of user core. Jan 28 01:19:55.021418 systemd-logind[2536]: New session 1 of user core. Jan 28 01:19:55.048305 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 28 01:19:55.049686 waagent[2663]: 2026-01-28T01:19:55.049646Z INFO Daemon Downloaded certificate {'thumbprint': '9664995065BFE3A7F4CDCC6E731CC1C8E427E441', 'hasPrivateKey': True} Jan 28 01:19:55.053010 waagent[2663]: 2026-01-28T01:19:55.052944Z INFO Daemon Fetch goal state completed Jan 28 01:19:55.053036 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 28 01:19:55.060475 waagent[2663]: 2026-01-28T01:19:55.060444Z INFO Daemon Daemon Starting provisioning Jan 28 01:19:55.060785 waagent[2663]: 2026-01-28T01:19:55.060718Z INFO Daemon Daemon Handle ovf-env.xml. 
Jan 28 01:19:55.061612 waagent[2663]: 2026-01-28T01:19:55.061551Z INFO Daemon Daemon Set hostname [ci-4593.0.0-n-84a137a86c] Jan 28 01:19:55.066023 (systemd)[2725]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:19:55.067975 systemd-logind[2536]: New session 3 of user core. Jan 28 01:19:55.089539 waagent[2663]: 2026-01-28T01:19:55.089498Z INFO Daemon Daemon Publish hostname [ci-4593.0.0-n-84a137a86c] Jan 28 01:19:55.090868 waagent[2663]: 2026-01-28T01:19:55.090830Z INFO Daemon Daemon Examine /proc/net/route for primary interface Jan 28 01:19:55.092719 waagent[2663]: 2026-01-28T01:19:55.091638Z INFO Daemon Daemon Primary interface is [eth0] Jan 28 01:19:55.099198 systemd-networkd[2191]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 01:19:55.099210 systemd-networkd[2191]: eth0: Reconfiguring with /usr/lib/systemd/network/zz-default.network. Jan 28 01:19:55.099258 systemd-networkd[2191]: eth0: DHCP lease lost Jan 28 01:19:55.112974 waagent[2663]: 2026-01-28T01:19:55.111495Z INFO Daemon Daemon Create user account if not exists Jan 28 01:19:55.112974 waagent[2663]: 2026-01-28T01:19:55.112015Z INFO Daemon Daemon User core already exists, skip useradd Jan 28 01:19:55.112974 waagent[2663]: 2026-01-28T01:19:55.112495Z INFO Daemon Daemon Configure sudoer Jan 28 01:19:55.118899 waagent[2663]: 2026-01-28T01:19:55.118418Z INFO Daemon Daemon Configure sshd Jan 28 01:19:55.121005 systemd-networkd[2191]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16 Jan 28 01:19:55.122163 waagent[2663]: 2026-01-28T01:19:55.122126Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Jan 28 01:19:55.125859 waagent[2663]: 2026-01-28T01:19:55.122560Z INFO Daemon Daemon Deploy ssh public key. Jan 28 01:19:55.217673 systemd[2725]: Queued start job for default target default.target. Jan 28 01:19:55.223884 systemd[2725]: Created slice app.slice - User Application Slice. Jan 28 01:19:55.223920 systemd[2725]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 28 01:19:55.223936 systemd[2725]: Reached target paths.target - Paths. Jan 28 01:19:55.223991 systemd[2725]: Reached target timers.target - Timers. Jan 28 01:19:55.224843 systemd[2725]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 28 01:19:55.225482 systemd[2725]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 28 01:19:55.235091 systemd[2725]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 28 01:19:55.235347 systemd[2725]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 28 01:19:55.235925 systemd[2725]: Reached target sockets.target - Sockets. Jan 28 01:19:55.235986 systemd[2725]: Reached target basic.target - Basic System. Jan 28 01:19:55.236014 systemd[2725]: Reached target default.target - Main User Target. Jan 28 01:19:55.236032 systemd[2725]: Startup finished in 164ms. Jan 28 01:19:55.236183 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 28 01:19:55.247095 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 28 01:19:55.247797 systemd[1]: Started session-2.scope - Session 2 of User core. 
Jan 28 01:19:56.202861 waagent[2663]: 2026-01-28T01:19:56.202804Z INFO Daemon Daemon Provisioning complete Jan 28 01:19:56.211293 waagent[2663]: 2026-01-28T01:19:56.211257Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Jan 28 01:19:56.212456 waagent[2663]: 2026-01-28T01:19:56.212387Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Jan 28 01:19:56.213402 waagent[2663]: 2026-01-28T01:19:56.212937Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Jan 28 01:19:56.317968 waagent[2763]: 2026-01-28T01:19:56.317909Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Jan 28 01:19:56.318225 waagent[2763]: 2026-01-28T01:19:56.318022Z INFO ExtHandler ExtHandler OS: flatcar 4593.0.0 Jan 28 01:19:56.318225 waagent[2763]: 2026-01-28T01:19:56.318070Z INFO ExtHandler ExtHandler Python: 3.12.11 Jan 28 01:19:56.318225 waagent[2763]: 2026-01-28T01:19:56.318107Z INFO ExtHandler ExtHandler CPU Arch: x86_64 Jan 28 01:19:56.367089 waagent[2763]: 2026-01-28T01:19:56.367040Z INFO ExtHandler ExtHandler Distro: flatcar-4593.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.12.11; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Jan 28 01:19:56.367238 waagent[2763]: 2026-01-28T01:19:56.367212Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jan 28 01:19:56.367292 waagent[2763]: 2026-01-28T01:19:56.367273Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Jan 28 01:19:56.376431 waagent[2763]: 2026-01-28T01:19:56.376371Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Jan 28 01:19:56.380136 waagent[2763]: 2026-01-28T01:19:56.380105Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.177 Jan 28 01:19:56.380439 waagent[2763]: 2026-01-28T01:19:56.380412Z INFO ExtHandler Jan 28 01:19:56.380489 waagent[2763]: 2026-01-28T01:19:56.380467Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: c0dae0ef-844d-44ec-b165-e1adcf30cff6 eTag: 5636434372206832783 source: Fabric] Jan 28 01:19:56.380673 waagent[2763]: 2026-01-28T01:19:56.380647Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Jan 28 01:19:56.381011 waagent[2763]: 2026-01-28T01:19:56.380983Z INFO ExtHandler Jan 28 01:19:56.381041 waagent[2763]: 2026-01-28T01:19:56.381030Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Jan 28 01:19:56.386100 waagent[2763]: 2026-01-28T01:19:56.386068Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Jan 28 01:19:56.445650 waagent[2763]: 2026-01-28T01:19:56.445605Z INFO ExtHandler Downloaded certificate {'thumbprint': '9664995065BFE3A7F4CDCC6E731CC1C8E427E441', 'hasPrivateKey': True} Jan 28 01:19:56.445975 waagent[2763]: 2026-01-28T01:19:56.445935Z INFO ExtHandler Fetch goal state completed Jan 28 01:19:56.459365 waagent[2763]: 2026-01-28T01:19:56.459293Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.5.4 30 Sep 2025 (Library: OpenSSL 3.5.4 30 Sep 2025) Jan 28 01:19:56.463155 waagent[2763]: 2026-01-28T01:19:56.463115Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2763 Jan 28 01:19:56.463265 waagent[2763]: 2026-01-28T01:19:56.463242Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Jan 28 01:19:56.463511 waagent[2763]: 2026-01-28T01:19:56.463487Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Jan 28 01:19:56.464574 waagent[2763]: 2026-01-28T01:19:56.464536Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4593.0.0', '', 'Flatcar Container Linux by Kinvolk'] Jan 28 01:19:56.464849 waagent[2763]: 2026-01-28T01:19:56.464820Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4593.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Jan 28 01:19:56.464981 waagent[2763]: 2026-01-28T01:19:56.464939Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Jan 28 01:19:56.465382 waagent[2763]: 2026-01-28T01:19:56.465355Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Jan 28 01:19:56.484235 waagent[2763]: 2026-01-28T01:19:56.484211Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Jan 28 01:19:56.484348 waagent[2763]: 2026-01-28T01:19:56.484328Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Jan 28 01:19:56.489456 waagent[2763]: 2026-01-28T01:19:56.489123Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Jan 28 01:19:56.493920 systemd[1]: Reload requested from client PID 2778 ('systemctl') (unit waagent.service)... Jan 28 01:19:56.493933 systemd[1]: Reloading... Jan 28 01:19:56.557978 zram_generator::config[2819]: No configuration found. Jan 28 01:19:56.723684 systemd[1]: Reloading finished in 229 ms. Jan 28 01:19:56.750904 waagent[2763]: 2026-01-28T01:19:56.750172Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Jan 28 01:19:56.750904 waagent[2763]: 2026-01-28T01:19:56.750318Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Jan 28 01:19:56.880313 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#118 cmd 0x4a status: scsi 0x0 srb 0x20 hv 0xc0000001 Jan 28 01:19:57.897195 waagent[2763]: 2026-01-28T01:19:57.897121Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Jan 28 01:19:57.897521 waagent[2763]: 2026-01-28T01:19:57.897478Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Jan 28 01:19:57.898158 waagent[2763]: 2026-01-28T01:19:57.898106Z INFO ExtHandler ExtHandler Starting env monitor service. Jan 28 01:19:57.898229 waagent[2763]: 2026-01-28T01:19:57.898200Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jan 28 01:19:57.898541 waagent[2763]: 2026-01-28T01:19:57.898516Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Jan 28 01:19:57.898592 waagent[2763]: 2026-01-28T01:19:57.898571Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Jan 28 01:19:57.898788 waagent[2763]: 2026-01-28T01:19:57.898766Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Jan 28 01:19:57.898961 waagent[2763]: 2026-01-28T01:19:57.898926Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jan 28 01:19:57.899157 waagent[2763]: 2026-01-28T01:19:57.899126Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Jan 28 01:19:57.899314 waagent[2763]: 2026-01-28T01:19:57.899290Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Jan 28 01:19:57.899448 waagent[2763]: 2026-01-28T01:19:57.899426Z INFO EnvHandler ExtHandler Configure routes Jan 28 01:19:57.899611 waagent[2763]: 2026-01-28T01:19:57.899593Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Jan 28 01:19:57.899664 waagent[2763]: 2026-01-28T01:19:57.899642Z INFO EnvHandler ExtHandler Gateway:None Jan 28 01:19:57.899893 waagent[2763]: 2026-01-28T01:19:57.899848Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Jan 28 01:19:57.899930 waagent[2763]: 2026-01-28T01:19:57.899905Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Jan 28 01:19:57.900017 waagent[2763]: 2026-01-28T01:19:57.899994Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Jan 28 01:19:57.900017 waagent[2763]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Jan 28 01:19:57.900017 waagent[2763]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Jan 28 01:19:57.900017 waagent[2763]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Jan 28 01:19:57.900017 waagent[2763]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Jan 28 01:19:57.900017 waagent[2763]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Jan 28 01:19:57.900017 waagent[2763]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Jan 28 01:19:57.900183 waagent[2763]: 2026-01-28T01:19:57.900162Z INFO EnvHandler ExtHandler Routes:None Jan 28 01:19:57.900373 waagent[2763]: 2026-01-28T01:19:57.900355Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Jan 28 01:19:57.905476 waagent[2763]: 2026-01-28T01:19:57.905445Z INFO ExtHandler ExtHandler Jan 28 01:19:57.905815 waagent[2763]: 2026-01-28T01:19:57.905790Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: d12a5e7c-cdd4-4a3c-856b-f86e39d762b7 correlation 37a46cbf-ea00-4e94-b4ca-e1de7e4b3c19 created: 2026-01-28T01:18:50.983986Z] Jan 28 01:19:57.906927 waagent[2763]: 2026-01-28T01:19:57.906885Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Jan 28 01:19:57.908504 waagent[2763]: 2026-01-28T01:19:57.908478Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 3 ms] Jan 28 01:19:57.938558 waagent[2763]: 2026-01-28T01:19:57.938514Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Jan 28 01:19:57.938558 waagent[2763]: Try `iptables -h' or 'iptables --help' for more information.) 
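The routing table that MonitorHandler prints comes from /proc/net/route, where destination, gateway and mask fields are little-endian 32-bit hex. A short decoding sketch using values from the table above:

import socket, struct

def hex_to_ip(h: str) -> str:
    # /proc/net/route stores IPv4 fields as little-endian 32-bit hex.
    return socket.inet_ntoa(struct.pack("<I", int(h, 16)))

print(hex_to_ip("0108C80A"))  # 10.200.8.1      (default gateway)
print(hex_to_ip("10813FA8"))  # 168.63.129.16   (WireServer host route)
print(hex_to_ip("FEA9FEA9"))  # 169.254.169.254 (IMDS host route)
print(hex_to_ip("00FFFFFF"))  # 255.255.255.0   (mask of the 10.200.8.0/24 route)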
Jan 28 01:19:57.938834 waagent[2763]: 2026-01-28T01:19:57.938810Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: A0DFDCE5-4127-4526-8166-35AE272C3C8E;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Jan 28 01:19:57.947343 waagent[2763]: 2026-01-28T01:19:57.947300Z INFO MonitorHandler ExtHandler Network interfaces: Jan 28 01:19:57.947343 waagent[2763]: Executing ['ip', '-a', '-o', 'link']: Jan 28 01:19:57.947343 waagent[2763]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Jan 28 01:19:57.947343 waagent[2763]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 60:45:bd:dd:17:f1 brd ff:ff:ff:ff:ff:ff\ alias Network Device\ altname enx6045bddd17f1 Jan 28 01:19:57.947343 waagent[2763]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 60:45:bd:dd:17:f1 brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0 Jan 28 01:19:57.947343 waagent[2763]: Executing ['ip', '-4', '-a', '-o', 'address']: Jan 28 01:19:57.947343 waagent[2763]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Jan 28 01:19:57.947343 waagent[2763]: 2: eth0 inet 10.200.8.20/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Jan 28 01:19:57.947343 waagent[2763]: Executing ['ip', '-6', '-a', '-o', 'address']: Jan 28 01:19:57.947343 waagent[2763]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Jan 28 01:19:57.947343 waagent[2763]: 2: eth0 inet6 fe80::6245:bdff:fedd:17f1/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Jan 28 01:19:58.006223 waagent[2763]: 2026-01-28T01:19:58.006180Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Jan 28 01:19:58.006223 waagent[2763]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.006223 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.006223 waagent[2763]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.006223 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.006223 waagent[2763]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.006223 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.006223 waagent[2763]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Jan 28 01:19:58.006223 waagent[2763]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Jan 28 01:19:58.006223 waagent[2763]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Jan 28 01:19:58.008562 waagent[2763]: 2026-01-28T01:19:58.008519Z INFO EnvHandler ExtHandler Current Firewall rules: Jan 28 01:19:58.008562 waagent[2763]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.008562 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.008562 waagent[2763]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.008562 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.008562 waagent[2763]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Jan 28 01:19:58.008562 waagent[2763]: pkts bytes target prot opt in out source destination Jan 28 01:19:58.008562 waagent[2763]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Jan 28 01:19:58.008562 
waagent[2763]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Jan 28 01:19:58.008562 waagent[2763]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Jan 28 01:20:04.677120 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 28 01:20:04.679158 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:05.058103 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:05.069183 (kubelet)[2918]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:20:05.099967 kubelet[2918]: E0128 01:20:05.099909 2918 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:20:05.102634 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:20:05.102752 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:20:05.103096 systemd[1]: kubelet.service: Consumed 118ms CPU time, 109M memory peak. Jan 28 01:20:07.439334 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 28 01:20:07.440460 systemd[1]: Started sshd@0-10.200.8.20:22-10.200.16.10:43370.service - OpenSSH per-connection server daemon (10.200.16.10:43370). Jan 28 01:20:08.093577 sshd[2926]: Accepted publickey for core from 10.200.16.10 port 43370 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:08.094621 sshd-session[2926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:08.098750 systemd-logind[2536]: New session 4 of user core. Jan 28 01:20:08.105117 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 28 01:20:08.501401 systemd[1]: Started sshd@1-10.200.8.20:22-10.200.16.10:43376.service - OpenSSH per-connection server daemon (10.200.16.10:43376). Jan 28 01:20:09.040986 sshd[2933]: Accepted publickey for core from 10.200.16.10 port 43376 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:09.041893 sshd-session[2933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:09.045537 systemd-logind[2536]: New session 5 of user core. Jan 28 01:20:09.052101 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 28 01:20:09.342936 sshd[2937]: Connection closed by 10.200.16.10 port 43376 Jan 28 01:20:09.344094 sshd-session[2933]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:09.346646 systemd[1]: sshd@1-10.200.8.20:22-10.200.16.10:43376.service: Deactivated successfully. Jan 28 01:20:09.348264 systemd[1]: session-5.scope: Deactivated successfully. Jan 28 01:20:09.349828 systemd-logind[2536]: Session 5 logged out. Waiting for processes to exit. Jan 28 01:20:09.350918 systemd-logind[2536]: Removed session 5. Jan 28 01:20:09.463398 systemd[1]: Started sshd@2-10.200.8.20:22-10.200.16.10:43388.service - OpenSSH per-connection server daemon (10.200.16.10:43388). 
Jan 28 01:20:09.995201 sshd[2943]: Accepted publickey for core from 10.200.16.10 port 43388 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:09.996299 sshd-session[2943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:10.000005 systemd-logind[2536]: New session 6 of user core. Jan 28 01:20:10.009126 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 28 01:20:10.293477 sshd[2947]: Connection closed by 10.200.16.10 port 43388 Jan 28 01:20:10.294095 sshd-session[2943]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:10.297259 systemd-logind[2536]: Session 6 logged out. Waiting for processes to exit. Jan 28 01:20:10.297837 systemd[1]: sshd@2-10.200.8.20:22-10.200.16.10:43388.service: Deactivated successfully. Jan 28 01:20:10.299375 systemd[1]: session-6.scope: Deactivated successfully. Jan 28 01:20:10.300572 systemd-logind[2536]: Removed session 6. Jan 28 01:20:10.407370 systemd[1]: Started sshd@3-10.200.8.20:22-10.200.16.10:43306.service - OpenSSH per-connection server daemon (10.200.16.10:43306). Jan 28 01:20:10.942608 sshd[2953]: Accepted publickey for core from 10.200.16.10 port 43306 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:10.943666 sshd-session[2953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:10.947807 systemd-logind[2536]: New session 7 of user core. Jan 28 01:20:10.957129 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 28 01:20:11.244073 sshd[2957]: Connection closed by 10.200.16.10 port 43306 Jan 28 01:20:11.244608 sshd-session[2953]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:11.247686 systemd-logind[2536]: Session 7 logged out. Waiting for processes to exit. Jan 28 01:20:11.248037 systemd[1]: sshd@3-10.200.8.20:22-10.200.16.10:43306.service: Deactivated successfully. Jan 28 01:20:11.249314 systemd[1]: session-7.scope: Deactivated successfully. Jan 28 01:20:11.250673 systemd-logind[2536]: Removed session 7. Jan 28 01:20:11.358198 systemd[1]: Started sshd@4-10.200.8.20:22-10.200.16.10:43316.service - OpenSSH per-connection server daemon (10.200.16.10:43316). Jan 28 01:20:11.891774 sshd[2963]: Accepted publickey for core from 10.200.16.10 port 43316 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:11.892848 sshd-session[2963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:11.896801 systemd-logind[2536]: New session 8 of user core. Jan 28 01:20:11.903107 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 28 01:20:12.210859 sudo[2968]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 28 01:20:12.211097 sudo[2968]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:20:12.219600 sudo[2968]: pam_unix(sudo:session): session closed for user root Jan 28 01:20:12.330765 sshd[2967]: Connection closed by 10.200.16.10 port 43316 Jan 28 01:20:12.332154 sshd-session[2963]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:12.335438 systemd[1]: sshd@4-10.200.8.20:22-10.200.16.10:43316.service: Deactivated successfully. Jan 28 01:20:12.337097 systemd[1]: session-8.scope: Deactivated successfully. Jan 28 01:20:12.337791 systemd-logind[2536]: Session 8 logged out. Waiting for processes to exit. Jan 28 01:20:12.339190 systemd-logind[2536]: Removed session 8. 
Jan 28 01:20:12.440301 systemd[1]: Started sshd@5-10.200.8.20:22-10.200.16.10:43324.service - OpenSSH per-connection server daemon (10.200.16.10:43324). Jan 28 01:20:12.982876 sshd[2975]: Accepted publickey for core from 10.200.16.10 port 43324 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:12.983964 sshd-session[2975]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:12.988113 systemd-logind[2536]: New session 9 of user core. Jan 28 01:20:12.995109 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 28 01:20:13.186705 sudo[2981]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 28 01:20:13.186968 sudo[2981]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:20:13.190593 sudo[2981]: pam_unix(sudo:session): session closed for user root Jan 28 01:20:13.195027 sudo[2980]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 28 01:20:13.195230 sudo[2980]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:20:13.200908 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 28 01:20:13.229836 kernel: kauditd_printk_skb: 144 callbacks suppressed Jan 28 01:20:13.229896 kernel: audit: type=1305 audit(1769563213.226:251): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 01:20:13.226000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 01:20:13.229324 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 01:20:13.230050 augenrules[3005]: No rules Jan 28 01:20:13.229917 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 28 01:20:13.234011 kernel: audit: type=1300 audit(1769563213.226:251): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe1237e0f0 a2=420 a3=0 items=0 ppid=2986 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:13.226000 audit[3005]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe1237e0f0 a2=420 a3=0 items=0 ppid=2986 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:13.233508 sudo[2980]: pam_unix(sudo:session): session closed for user root Jan 28 01:20:13.226000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:20:13.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.237858 kernel: audit: type=1327 audit(1769563213.226:251): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 01:20:13.237892 kernel: audit: type=1130 audit(1769563213.229:252): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:13.237909 kernel: audit: type=1131 audit(1769563213.229:253): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.229000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.232000 audit[2980]: USER_END pid=2980 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.242498 kernel: audit: type=1106 audit(1769563213.232:254): pid=2980 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.242529 kernel: audit: type=1104 audit(1769563213.232:255): pid=2980 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.232000 audit[2980]: CRED_DISP pid=2980 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.333273 sshd[2979]: Connection closed by 10.200.16.10 port 43324 Jan 28 01:20:13.334101 sshd-session[2975]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:13.334000 audit[2975]: USER_END pid=2975 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.341056 kernel: audit: type=1106 audit(1769563213.334:256): pid=2975 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.341118 kernel: audit: type=1104 audit(1769563213.334:257): pid=2975 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.334000 audit[2975]: CRED_DISP pid=2975 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.340545 systemd-logind[2536]: Session 9 logged out. Waiting for processes to exit. Jan 28 01:20:13.341372 systemd[1]: sshd@5-10.200.8.20:22-10.200.16.10:43324.service: Deactivated successfully. 
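The PROCTITLE fields in the audit records above and below are hex-encoded command lines with argv elements separated by NUL bytes. A small decoder, with the auditctl record above as a worked example:

def decode_proctitle(hexstr: str) -> str:
    # Audit PROCTITLE: raw command line, hex-encoded, argv joined by NUL bytes.
    return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode()

print(decode_proctitle(
    "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"))
# -> /sbin/auditctl -R /etc/audit/audit.rules

The same helper reads the iptables invocations recorded while Docker creates its chains further down (e.g. /usr/bin/iptables --wait -t nat -N DOCKER).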
Jan 28 01:20:13.345752 kernel: audit: type=1131 audit(1769563213.340:258): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.200.8.20:22-10.200.16.10:43324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.340000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.200.8.20:22-10.200.16.10:43324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.345583 systemd[1]: session-9.scope: Deactivated successfully. Jan 28 01:20:13.347591 systemd-logind[2536]: Removed session 9. Jan 28 01:20:13.450000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.20:22-10.200.16.10:43340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:13.451256 systemd[1]: Started sshd@6-10.200.8.20:22-10.200.16.10:43340.service - OpenSSH per-connection server daemon (10.200.16.10:43340). Jan 28 01:20:13.984000 audit[3014]: USER_ACCT pid=3014 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.985370 sshd[3014]: Accepted publickey for core from 10.200.16.10 port 43340 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:20:13.985000 audit[3014]: CRED_ACQ pid=3014 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.985000 audit[3014]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdc8111570 a2=3 a3=0 items=0 ppid=1 pid=3014 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:13.985000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:20:13.986513 sshd-session[3014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:20:13.990586 systemd-logind[2536]: New session 10 of user core. Jan 28 01:20:13.995141 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 28 01:20:13.996000 audit[3014]: USER_START pid=3014 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:13.997000 audit[3018]: CRED_ACQ pid=3018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:14.188000 audit[3019]: USER_ACCT pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:14.189782 sudo[3019]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 28 01:20:14.190023 sudo[3019]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 01:20:14.188000 audit[3019]: CRED_REFR pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:14.189000 audit[3019]: USER_START pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:15.177226 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 28 01:20:15.178902 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:15.795936 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:15.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:15.807133 (kubelet)[3046]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:20:15.843593 kubelet[3046]: E0128 01:20:15.843541 3046 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:20:15.845193 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:20:15.845317 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:20:15.844000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:20:15.845654 systemd[1]: kubelet.service: Consumed 121ms CPU time, 110.7M memory peak. Jan 28 01:20:16.098544 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 28 01:20:16.104213 (dockerd)[3053]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 28 01:20:16.651276 chronyd[2513]: Selected source PHC0 Jan 28 01:20:17.621116 dockerd[3053]: time="2026-01-28T01:20:17.621062282Z" level=info msg="Starting up" Jan 28 01:20:17.621768 dockerd[3053]: time="2026-01-28T01:20:17.621730376Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 28 01:20:17.631092 dockerd[3053]: time="2026-01-28T01:20:17.631057450Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 28 01:20:17.653677 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport796616317-merged.mount: Deactivated successfully. Jan 28 01:20:17.761753 dockerd[3053]: time="2026-01-28T01:20:17.761708946Z" level=info msg="Loading containers: start." 
Jan 28 01:20:17.771994 kernel: Initializing XFRM netlink socket Jan 28 01:20:17.807000 audit[3098]: NETFILTER_CFG table=nat:5 family=2 entries=2 op=nft_register_chain pid=3098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.807000 audit[3098]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffe4e807760 a2=0 a3=0 items=0 ppid=3053 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.807000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 01:20:17.809000 audit[3100]: NETFILTER_CFG table=filter:6 family=2 entries=2 op=nft_register_chain pid=3100 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.809000 audit[3100]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe47d28630 a2=0 a3=0 items=0 ppid=3053 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.809000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 01:20:17.810000 audit[3102]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.810000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffc5141080 a2=0 a3=0 items=0 ppid=3053 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.810000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 01:20:17.812000 audit[3104]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.812000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd7bd286d0 a2=0 a3=0 items=0 ppid=3053 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.812000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 01:20:17.813000 audit[3106]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_chain pid=3106 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.813000 audit[3106]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc7bb27b70 a2=0 a3=0 items=0 ppid=3053 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.813000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 01:20:17.815000 audit[3108]: NETFILTER_CFG table=filter:10 family=2 entries=1 op=nft_register_chain pid=3108 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.815000 audit[3108]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=112 a0=3 a1=7fff02054fb0 a2=0 a3=0 items=0 ppid=3053 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.815000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:20:17.816000 audit[3110]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.816000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffea60ed120 a2=0 a3=0 items=0 ppid=3053 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.816000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:20:17.818000 audit[3112]: NETFILTER_CFG table=nat:12 family=2 entries=2 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.818000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffdd0d78990 a2=0 a3=0 items=0 ppid=3053 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.818000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 01:20:17.862000 audit[3115]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.862000 audit[3115]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffd06352e10 a2=0 a3=0 items=0 ppid=3053 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.862000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 28 01:20:17.864000 audit[3117]: NETFILTER_CFG table=filter:14 family=2 entries=2 op=nft_register_chain pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.864000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffdc3f57740 a2=0 a3=0 items=0 ppid=3053 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.864000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 01:20:17.866000 audit[3119]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.866000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 
a1=7fff10d7a0f0 a2=0 a3=0 items=0 ppid=3053 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.866000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 01:20:17.868000 audit[3121]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=3121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.868000 audit[3121]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fffc62e18e0 a2=0 a3=0 items=0 ppid=3053 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.868000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:20:17.871000 audit[3123]: NETFILTER_CFG table=filter:17 family=2 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.871000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffdf22d80c0 a2=0 a3=0 items=0 ppid=3053 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.871000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 01:20:17.934000 audit[3153]: NETFILTER_CFG table=nat:18 family=10 entries=2 op=nft_register_chain pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.934000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd76abdc70 a2=0 a3=0 items=0 ppid=3053 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.934000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 01:20:17.936000 audit[3155]: NETFILTER_CFG table=filter:19 family=10 entries=2 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.936000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff3ab7f130 a2=0 a3=0 items=0 ppid=3053 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.936000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 01:20:17.938000 audit[3157]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.938000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe678f7160 a2=0 a3=0 items=0 ppid=3053 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.938000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 01:20:17.939000 audit[3159]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.939000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd7bd14320 a2=0 a3=0 items=0 ppid=3053 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.939000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 01:20:17.941000 audit[3161]: NETFILTER_CFG table=filter:22 family=10 entries=1 op=nft_register_chain pid=3161 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.941000 audit[3161]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe8ad6aa40 a2=0 a3=0 items=0 ppid=3053 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.941000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 01:20:17.943000 audit[3163]: NETFILTER_CFG table=filter:23 family=10 entries=1 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.943000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd23490cf0 a2=0 a3=0 items=0 ppid=3053 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.943000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:20:17.945000 audit[3165]: NETFILTER_CFG table=filter:24 family=10 entries=1 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.945000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff34fb3260 a2=0 a3=0 items=0 ppid=3053 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.945000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:20:17.946000 audit[3167]: NETFILTER_CFG table=nat:25 family=10 entries=2 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.946000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffcfa2bf590 a2=0 a3=0 items=0 ppid=3053 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.946000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 01:20:17.948000 audit[3169]: NETFILTER_CFG table=nat:26 family=10 entries=2 op=nft_register_chain pid=3169 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.948000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffc7c5798f0 a2=0 a3=0 items=0 ppid=3053 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.948000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 28 01:20:17.950000 audit[3171]: NETFILTER_CFG table=filter:27 family=10 entries=2 op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.950000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff3cc05410 a2=0 a3=0 items=0 ppid=3053 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.950000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 01:20:17.951000 audit[3173]: NETFILTER_CFG table=filter:28 family=10 entries=1 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.951000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fff70c60c50 a2=0 a3=0 items=0 ppid=3053 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.951000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 01:20:17.953000 audit[3175]: NETFILTER_CFG table=filter:29 family=10 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.953000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffca2dc5fa0 a2=0 a3=0 items=0 ppid=3053 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.953000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 01:20:17.955000 audit[3177]: NETFILTER_CFG table=filter:30 family=10 entries=1 op=nft_register_rule pid=3177 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.955000 audit[3177]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffccd8fdfa0 a2=0 a3=0 items=0 ppid=3053 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.955000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 01:20:17.959000 audit[3182]: NETFILTER_CFG table=filter:31 family=2 entries=1 op=nft_register_chain pid=3182 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.959000 audit[3182]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd7a55a850 a2=0 a3=0 items=0 ppid=3053 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.959000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 01:20:17.960000 audit[3184]: NETFILTER_CFG table=filter:32 family=2 entries=1 op=nft_register_rule pid=3184 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.960000 audit[3184]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffdb76d6740 a2=0 a3=0 items=0 ppid=3053 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.960000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 01:20:17.962000 audit[3186]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:17.962000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe47c00640 a2=0 a3=0 items=0 ppid=3053 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.962000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 01:20:17.963000 audit[3188]: NETFILTER_CFG table=filter:34 family=10 entries=1 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.963000 audit[3188]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd6a19e170 a2=0 a3=0 items=0 ppid=3053 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.963000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 01:20:17.965000 audit[3190]: NETFILTER_CFG table=filter:35 family=10 entries=1 op=nft_register_rule pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.965000 audit[3190]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffdfad5de00 a2=0 a3=0 items=0 ppid=3053 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.965000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 01:20:17.966000 audit[3192]: NETFILTER_CFG table=filter:36 family=10 entries=1 op=nft_register_rule pid=3192 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:17.966000 audit[3192]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd7fbc4030 a2=0 a3=0 items=0 ppid=3053 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:17.966000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 01:20:18.014000 audit[3197]: NETFILTER_CFG table=nat:37 family=2 entries=2 op=nft_register_chain pid=3197 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.014000 audit[3197]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffd6cb6f810 a2=0 a3=0 items=0 ppid=3053 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.014000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 28 01:20:18.016000 audit[3199]: NETFILTER_CFG table=nat:38 family=2 entries=1 op=nft_register_rule pid=3199 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.016000 audit[3199]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffe6d7eb2a0 a2=0 a3=0 items=0 ppid=3053 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.016000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 28 01:20:18.023000 audit[3207]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.023000 audit[3207]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffca12b84a0 a2=0 a3=0 items=0 ppid=3053 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.023000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 28 01:20:18.027000 audit[3212]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=3212 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.027000 audit[3212]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd06e087d0 a2=0 a3=0 items=0 ppid=3053 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.027000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 28 01:20:18.029000 audit[3214]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 
01:20:18.029000 audit[3214]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7fffeeb2f100 a2=0 a3=0 items=0 ppid=3053 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.029000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 28 01:20:18.030000 audit[3216]: NETFILTER_CFG table=filter:42 family=2 entries=1 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.030000 audit[3216]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc5a88f5e0 a2=0 a3=0 items=0 ppid=3053 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.030000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 28 01:20:18.032000 audit[3218]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.032000 audit[3218]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fffa5b1b270 a2=0 a3=0 items=0 ppid=3053 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.032000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 01:20:18.034000 audit[3220]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:18.034000 audit[3220]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffcea535110 a2=0 a3=0 items=0 ppid=3053 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:18.034000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 28 01:20:18.035477 systemd-networkd[2191]: docker0: Link UP Jan 28 01:20:18.046170 dockerd[3053]: time="2026-01-28T01:20:18.046124248Z" level=info msg="Loading containers: done." Jan 28 01:20:18.056747 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2568382898-merged.mount: Deactivated successfully. 
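The NETFILTER_CFG records above carry the command line of each iptables/ip6tables invocation in the PROCTITLE field as hex-encoded argv with NUL separators. A small Python sketch follows that decodes one of the strings from this log back into the readable command; the helper name and print formatting are my own, only the sample hex value is taken verbatim from the records above.

```python
# Decode an audit PROCTITLE value: it is the process argv, hex-encoded,
# with NUL bytes separating the individual arguments.
def decode_proctitle(hex_argv: str) -> str:
    raw = bytes.fromhex(hex_argv)
    return " ".join(part.decode("utf-8", errors="replace")
                    for part in raw.split(b"\x00") if part)

if __name__ == "__main__":
    # Copied from the first NETFILTER_CFG record in this log.
    sample = ("2F7573722F62696E2F69707461626C6573002D2D77616974"
              "002D74006E6174002D4E00444F434B4552")
    print(decode_proctitle(sample))
    # -> /usr/bin/iptables --wait -t nat -N DOCKER
```

Decoding the remaining records the same way shows dockerd creating its usual DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2 and DOCKER-USER chains for both IPv4 and IPv6.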
Jan 28 01:20:18.088672 dockerd[3053]: time="2026-01-28T01:20:18.088636820Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 28 01:20:18.088774 dockerd[3053]: time="2026-01-28T01:20:18.088704417Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 28 01:20:18.088774 dockerd[3053]: time="2026-01-28T01:20:18.088771342Z" level=info msg="Initializing buildkit" Jan 28 01:20:18.123062 dockerd[3053]: time="2026-01-28T01:20:18.123022068Z" level=info msg="Completed buildkit initialization" Jan 28 01:20:18.128990 dockerd[3053]: time="2026-01-28T01:20:18.128715242Z" level=info msg="Daemon has completed initialization" Jan 28 01:20:18.128990 dockerd[3053]: time="2026-01-28T01:20:18.128808122Z" level=info msg="API listen on /run/docker.sock" Jan 28 01:20:18.129572 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 28 01:20:18.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:19.121378 containerd[2557]: time="2026-01-28T01:20:19.121342441Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 28 01:20:20.194004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1915550292.mount: Deactivated successfully. Jan 28 01:20:21.188700 containerd[2557]: time="2026-01-28T01:20:21.188654203Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:21.190741 containerd[2557]: time="2026-01-28T01:20:21.190632119Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=28445968" Jan 28 01:20:21.192767 containerd[2557]: time="2026-01-28T01:20:21.192745913Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:21.195735 containerd[2557]: time="2026-01-28T01:20:21.195708668Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:21.196371 containerd[2557]: time="2026-01-28T01:20:21.196345436Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 2.074970615s" Jan 28 01:20:21.196412 containerd[2557]: time="2026-01-28T01:20:21.196374357Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\"" Jan 28 01:20:21.196995 containerd[2557]: time="2026-01-28T01:20:21.196974986Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 28 01:20:22.667665 containerd[2557]: time="2026-01-28T01:20:22.667619265Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:22.669668 containerd[2557]: time="2026-01-28T01:20:22.669559097Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26008626" Jan 28 01:20:22.672259 containerd[2557]: time="2026-01-28T01:20:22.672238327Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:22.675331 containerd[2557]: time="2026-01-28T01:20:22.675303714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:22.675935 containerd[2557]: time="2026-01-28T01:20:22.675795648Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 1.478797963s" Jan 28 01:20:22.675935 containerd[2557]: time="2026-01-28T01:20:22.675822775Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\"" Jan 28 01:20:22.676303 containerd[2557]: time="2026-01-28T01:20:22.676284262Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 28 01:20:23.878811 containerd[2557]: time="2026-01-28T01:20:23.878764079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:23.881034 containerd[2557]: time="2026-01-28T01:20:23.880920008Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=20152717" Jan 28 01:20:23.883340 containerd[2557]: time="2026-01-28T01:20:23.883318919Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:23.889233 containerd[2557]: time="2026-01-28T01:20:23.889197903Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.212888535s" Jan 28 01:20:23.889233 containerd[2557]: time="2026-01-28T01:20:23.889235960Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\"" Jan 28 01:20:23.889586 containerd[2557]: time="2026-01-28T01:20:23.889534514Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:23.889806 containerd[2557]: time="2026-01-28T01:20:23.889792754Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 
28 01:20:24.798808 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1655558044.mount: Deactivated successfully. Jan 28 01:20:25.184134 containerd[2557]: time="2026-01-28T01:20:25.184095183Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:25.186138 containerd[2557]: time="2026-01-28T01:20:25.186059245Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=20340589" Jan 28 01:20:25.188171 containerd[2557]: time="2026-01-28T01:20:25.188148434Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:25.190915 containerd[2557]: time="2026-01-28T01:20:25.190891526Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:25.191312 containerd[2557]: time="2026-01-28T01:20:25.191169685Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 1.301353394s" Jan 28 01:20:25.191312 containerd[2557]: time="2026-01-28T01:20:25.191196153Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\"" Jan 28 01:20:25.191741 containerd[2557]: time="2026-01-28T01:20:25.191718176Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 28 01:20:25.843375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3612579227.mount: Deactivated successfully. Jan 28 01:20:25.927166 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 28 01:20:25.929479 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:26.354318 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:26.353000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:26.355611 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 28 01:20:26.355662 kernel: audit: type=1130 audit(1769563226.353:311): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:26.365252 (kubelet)[3353]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 01:20:26.392908 kubelet[3353]: E0128 01:20:26.392874 3353 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 01:20:26.566745 kernel: audit: type=1131 audit(1769563226.393:312): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:20:26.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:20:26.394332 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 01:20:26.394426 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 01:20:26.394726 systemd[1]: kubelet.service: Consumed 129ms CPU time, 108.3M memory peak. Jan 28 01:20:27.088549 containerd[2557]: time="2026-01-28T01:20:27.088502727Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:27.090644 containerd[2557]: time="2026-01-28T01:20:27.090473902Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20251609" Jan 28 01:20:27.093263 containerd[2557]: time="2026-01-28T01:20:27.093240923Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:27.096410 containerd[2557]: time="2026-01-28T01:20:27.096381854Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:27.097296 containerd[2557]: time="2026-01-28T01:20:27.097271009Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.905529398s" Jan 28 01:20:27.097348 containerd[2557]: time="2026-01-28T01:20:27.097295948Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jan 28 01:20:27.097889 containerd[2557]: time="2026-01-28T01:20:27.097867872Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 28 01:20:27.660835 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount566787843.mount: Deactivated successfully. 
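The two kubelet.service failures above (restart counters 2 and 3) report the same condition: /var/lib/kubelet/config.yaml cannot be opened because it does not exist yet, which is typical for a node that has not been joined to a cluster, since that file is normally generated by kubeadm. The fragment below only illustrates that check as a sketch; the path and error wording are taken from the log entries, everything else is illustrative.

```python
# Illustrative check mirroring the kubelet startup failure seen above.
from pathlib import Path

CONFIG = Path("/var/lib/kubelet/config.yaml")

if CONFIG.is_file():
    print(f"{CONFIG} present ({CONFIG.stat().st_size} bytes); kubelet can load it")
else:
    print(f"{CONFIG} missing; kubelet exits with "
          f"'open {CONFIG}: no such file or directory' until the file is created")
```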
Jan 28 01:20:27.676037 containerd[2557]: time="2026-01-28T01:20:27.675996098Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:20:27.678009 containerd[2557]: time="2026-01-28T01:20:27.677986630Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 28 01:20:27.707978 containerd[2557]: time="2026-01-28T01:20:27.707724846Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:20:27.711776 containerd[2557]: time="2026-01-28T01:20:27.711245564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 01:20:27.711776 containerd[2557]: time="2026-01-28T01:20:27.711672238Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 613.770793ms" Jan 28 01:20:27.711776 containerd[2557]: time="2026-01-28T01:20:27.711694458Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 28 01:20:27.712343 containerd[2557]: time="2026-01-28T01:20:27.712300391Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 28 01:20:28.321011 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount582066599.mount: Deactivated successfully. 
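The containerd entries in this stretch report each pulled image together with its size and wall-clock pull time (seconds for the larger images, milliseconds for pause). Below is a rough parsing sketch over lines in that shape; the regular expression is tuned to the escaped msg formatting shown in this journal and the sample line is abbreviated from the kube-apiserver pull above, so treat it as an assumption rather than a stable containerd output format.

```python
# Extract image name, reported size and pull duration from a
# containerd 'Pulled image ... in <duration>' journal line.
import re

PULLED = re.compile(
    r'msg="Pulled image \\"(?P<image>[^\\"]+)\\".*?'
    r'size \\"(?P<size>\d+)\\" in (?P<duration>[\d.]+)(?P<unit>ms|s)"'
)

def summarize(line: str):
    m = PULLED.search(line)
    if not m:
        return None
    seconds = float(m["duration"]) / (1000.0 if m["unit"] == "ms" else 1.0)
    return m["image"], int(m["size"]), seconds

# Abbreviated copy of the kube-apiserver pull entry from this log.
sample = (
    'containerd[2557]: time="2026-01-28T01:20:21.196345436Z" level=info '
    'msg="Pulled image \\"registry.k8s.io/kube-apiserver:v1.33.7\\" with image id '
    '\\"sha256:021d...\\", repo tag \\"registry.k8s.io/kube-apiserver:v1.33.7\\", '
    'repo digest \\"registry.k8s.io/kube-apiserver@sha256:9585...\\", '
    'size \\"30111311\\" in 2.074970615s"'
)
print(summarize(sample))
# -> ('registry.k8s.io/kube-apiserver:v1.33.7', 30111311, 2.074970615)
```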
Jan 28 01:20:30.011756 containerd[2557]: time="2026-01-28T01:20:30.011710717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:30.013966 containerd[2557]: time="2026-01-28T01:20:30.013932951Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=56977083" Jan 28 01:20:30.019023 containerd[2557]: time="2026-01-28T01:20:30.019000786Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:30.021965 containerd[2557]: time="2026-01-28T01:20:30.021914317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:30.022723 containerd[2557]: time="2026-01-28T01:20:30.022615040Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.310278365s" Jan 28 01:20:30.022723 containerd[2557]: time="2026-01-28T01:20:30.022641593Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jan 28 01:20:32.378277 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:32.377000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:32.378694 systemd[1]: kubelet.service: Consumed 129ms CPU time, 108.3M memory peak. Jan 28 01:20:32.383182 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:32.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:32.397086 kernel: audit: type=1130 audit(1769563232.377:313): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:32.397160 kernel: audit: type=1131 audit(1769563232.377:314): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:32.417157 systemd[1]: Reload requested from client PID 3488 ('systemctl') (unit session-10.scope)... Jan 28 01:20:32.417171 systemd[1]: Reloading... Jan 28 01:20:32.517981 zram_generator::config[3538]: No configuration found. Jan 28 01:20:32.703641 systemd[1]: Reloading finished in 286 ms. Jan 28 01:20:32.737296 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 28 01:20:32.737360 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 28 01:20:32.737620 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:32.737667 systemd[1]: kubelet.service: Consumed 72ms CPU time, 74.4M memory peak. 
Jan 28 01:20:32.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:20:32.742000 audit: BPF prog-id=87 op=LOAD Jan 28 01:20:32.742772 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:32.745967 kernel: audit: type=1130 audit(1769563232.736:315): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 01:20:32.746030 kernel: audit: type=1334 audit(1769563232.742:316): prog-id=87 op=LOAD Jan 28 01:20:32.742000 audit: BPF prog-id=74 op=UNLOAD Jan 28 01:20:32.742000 audit: BPF prog-id=88 op=LOAD Jan 28 01:20:32.749571 kernel: audit: type=1334 audit(1769563232.742:317): prog-id=74 op=UNLOAD Jan 28 01:20:32.749605 kernel: audit: type=1334 audit(1769563232.742:318): prog-id=88 op=LOAD Jan 28 01:20:32.742000 audit: BPF prog-id=89 op=LOAD Jan 28 01:20:32.750812 kernel: audit: type=1334 audit(1769563232.742:319): prog-id=89 op=LOAD Jan 28 01:20:32.742000 audit: BPF prog-id=75 op=UNLOAD Jan 28 01:20:32.753971 kernel: audit: type=1334 audit(1769563232.742:320): prog-id=75 op=UNLOAD Jan 28 01:20:32.754024 kernel: audit: type=1334 audit(1769563232.742:321): prog-id=76 op=UNLOAD Jan 28 01:20:32.742000 audit: BPF prog-id=76 op=UNLOAD Jan 28 01:20:32.745000 audit: BPF prog-id=90 op=LOAD Jan 28 01:20:32.757674 kernel: audit: type=1334 audit(1769563232.745:322): prog-id=90 op=LOAD Jan 28 01:20:32.745000 audit: BPF prog-id=91 op=LOAD Jan 28 01:20:32.745000 audit: BPF prog-id=68 op=UNLOAD Jan 28 01:20:32.745000 audit: BPF prog-id=69 op=UNLOAD Jan 28 01:20:32.745000 audit: BPF prog-id=92 op=LOAD Jan 28 01:20:32.745000 audit: BPF prog-id=73 op=UNLOAD Jan 28 01:20:32.746000 audit: BPF prog-id=93 op=LOAD Jan 28 01:20:32.746000 audit: BPF prog-id=84 op=UNLOAD Jan 28 01:20:32.746000 audit: BPF prog-id=94 op=LOAD Jan 28 01:20:32.746000 audit: BPF prog-id=95 op=LOAD Jan 28 01:20:32.746000 audit: BPF prog-id=85 op=UNLOAD Jan 28 01:20:32.746000 audit: BPF prog-id=86 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=96 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=77 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=97 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=98 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=78 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=79 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=99 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=80 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=100 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=101 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=81 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=82 op=UNLOAD Jan 28 01:20:32.752000 audit: BPF prog-id=102 op=LOAD Jan 28 01:20:32.752000 audit: BPF prog-id=70 op=UNLOAD Jan 28 01:20:32.755000 audit: BPF prog-id=103 op=LOAD Jan 28 01:20:32.755000 audit: BPF prog-id=104 op=LOAD Jan 28 01:20:32.755000 audit: BPF prog-id=71 op=UNLOAD Jan 28 01:20:32.755000 audit: BPF prog-id=72 op=UNLOAD Jan 28 01:20:32.755000 audit: BPF prog-id=105 op=LOAD Jan 28 01:20:32.755000 audit: BPF prog-id=67 op=UNLOAD Jan 28 01:20:32.755000 audit: BPF prog-id=106 op=LOAD Jan 28 01:20:32.755000 audit: BPF prog-id=83 op=UNLOAD Jan 28 01:20:33.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:33.304872 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:33.311184 (kubelet)[3605]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 01:20:33.345419 kubelet[3605]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:20:33.345419 kubelet[3605]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 01:20:33.345419 kubelet[3605]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:20:33.345676 kubelet[3605]: I0128 01:20:33.345487 3605 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 01:20:33.608746 kubelet[3605]: I0128 01:20:33.608449 3605 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 01:20:33.608746 kubelet[3605]: I0128 01:20:33.608474 3605 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 01:20:33.608927 kubelet[3605]: I0128 01:20:33.608820 3605 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 01:20:33.633940 kubelet[3605]: E0128 01:20:33.633898 3605 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.8.20:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 28 01:20:33.635970 kubelet[3605]: I0128 01:20:33.635832 3605 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 01:20:33.645179 kubelet[3605]: I0128 01:20:33.645162 3605 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 01:20:33.647751 kubelet[3605]: I0128 01:20:33.647731 3605 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 28 01:20:33.647942 kubelet[3605]: I0128 01:20:33.647923 3605 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 01:20:33.648083 kubelet[3605]: I0128 01:20:33.647940 3605 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593.0.0-n-84a137a86c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 01:20:33.648200 kubelet[3605]: I0128 01:20:33.648089 3605 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 01:20:33.648200 kubelet[3605]: I0128 01:20:33.648098 3605 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 01:20:33.648200 kubelet[3605]: I0128 01:20:33.648196 3605 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:20:33.655718 kubelet[3605]: I0128 01:20:33.655704 3605 kubelet.go:480] "Attempting to sync node with API server" Jan 28 01:20:33.655718 kubelet[3605]: I0128 01:20:33.655719 3605 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 01:20:33.655811 kubelet[3605]: I0128 01:20:33.655739 3605 kubelet.go:386] "Adding apiserver pod source" Jan 28 01:20:33.657280 kubelet[3605]: I0128 01:20:33.657149 3605 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 01:20:33.674018 kubelet[3605]: E0128 01:20:33.673987 3605 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.8.20:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593.0.0-n-84a137a86c&limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 28 01:20:33.674405 kubelet[3605]: E0128 01:20:33.674380 3605 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.8.20:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Jan 28 01:20:33.674488 kubelet[3605]: I0128 01:20:33.674476 3605 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 01:20:33.674862 kubelet[3605]: I0128 01:20:33.674842 3605 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 01:20:33.675375 kubelet[3605]: W0128 01:20:33.675360 3605 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 28 01:20:33.677121 kubelet[3605]: I0128 01:20:33.677105 3605 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 01:20:33.677183 kubelet[3605]: I0128 01:20:33.677149 3605 server.go:1289] "Started kubelet" Jan 28 01:20:33.682180 kubelet[3605]: I0128 01:20:33.682154 3605 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 01:20:33.686969 kubelet[3605]: E0128 01:20:33.685426 3605 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.20:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.20:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593.0.0-n-84a137a86c.188ec062aeea93d7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593.0.0-n-84a137a86c,UID:ci-4593.0.0-n-84a137a86c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593.0.0-n-84a137a86c,},FirstTimestamp:2026-01-28 01:20:33.677120471 +0000 UTC m=+0.362827014,LastTimestamp:2026-01-28 01:20:33.677120471 +0000 UTC m=+0.362827014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593.0.0-n-84a137a86c,}" Jan 28 01:20:33.686000 audit[3618]: NETFILTER_CFG table=mangle:45 family=2 entries=2 op=nft_register_chain pid=3618 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.686000 audit[3618]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd0b6e6cb0 a2=0 a3=0 items=0 ppid=3605 pid=3618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.686000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 01:20:33.687000 audit[3619]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_chain pid=3619 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.687000 audit[3619]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda53e66e0 a2=0 a3=0 items=0 ppid=3605 pid=3619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.687000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 01:20:33.689476 kubelet[3605]: I0128 01:20:33.689455 3605 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 01:20:33.690124 kubelet[3605]: I0128 01:20:33.690109 3605 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 01:20:33.690288 kubelet[3605]: E0128 
01:20:33.690275 3605 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593.0.0-n-84a137a86c\" not found" Jan 28 01:20:33.690727 kubelet[3605]: I0128 01:20:33.690713 3605 server.go:317] "Adding debug handlers to kubelet server" Jan 28 01:20:33.690000 audit[3621]: NETFILTER_CFG table=filter:47 family=2 entries=2 op=nft_register_chain pid=3621 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.690000 audit[3621]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff585d8400 a2=0 a3=0 items=0 ppid=3605 pid=3621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.690000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:20:33.692243 kubelet[3605]: I0128 01:20:33.691977 3605 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 01:20:33.692243 kubelet[3605]: I0128 01:20:33.692034 3605 reconciler.go:26] "Reconciler: start to sync state" Jan 28 01:20:33.693000 audit[3625]: NETFILTER_CFG table=filter:48 family=2 entries=2 op=nft_register_chain pid=3625 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.693000 audit[3625]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffea1b1e3c0 a2=0 a3=0 items=0 ppid=3605 pid=3625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.693000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:20:33.696196 kubelet[3605]: I0128 01:20:33.695786 3605 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 01:20:33.696282 kubelet[3605]: I0128 01:20:33.696271 3605 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 01:20:33.696574 kubelet[3605]: I0128 01:20:33.696495 3605 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 01:20:33.698062 kubelet[3605]: E0128 01:20:33.698032 3605 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593.0.0-n-84a137a86c?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="200ms" Jan 28 01:20:33.698233 kubelet[3605]: I0128 01:20:33.698216 3605 factory.go:223] Registration of the systemd container factory successfully Jan 28 01:20:33.698353 kubelet[3605]: I0128 01:20:33.698324 3605 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 01:20:33.699735 kubelet[3605]: E0128 01:20:33.699711 3605 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.8.20:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" Jan 28 01:20:33.700267 kubelet[3605]: E0128 01:20:33.700251 3605 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 28 01:20:33.700407 kubelet[3605]: I0128 01:20:33.700382 3605 factory.go:223] Registration of the containerd container factory successfully Jan 28 01:20:33.724000 audit[3630]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_rule pid=3630 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.724000 audit[3630]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffc10cc47e0 a2=0 a3=0 items=0 ppid=3605 pid=3630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.724000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 28 01:20:33.725727 kubelet[3605]: I0128 01:20:33.725654 3605 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 28 01:20:33.725000 audit[3632]: NETFILTER_CFG table=mangle:50 family=10 entries=2 op=nft_register_chain pid=3632 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:33.725000 audit[3632]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff373cc4a0 a2=0 a3=0 items=0 ppid=3605 pid=3632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.725000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 01:20:33.727779 kubelet[3605]: I0128 01:20:33.727561 3605 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 28 01:20:33.727779 kubelet[3605]: I0128 01:20:33.727575 3605 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 01:20:33.727779 kubelet[3605]: I0128 01:20:33.727591 3605 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 28 01:20:33.727779 kubelet[3605]: I0128 01:20:33.727603 3605 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 01:20:33.727779 kubelet[3605]: E0128 01:20:33.727634 3605 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 01:20:33.727000 audit[3635]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=3635 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:33.727000 audit[3635]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff98459d90 a2=0 a3=0 items=0 ppid=3605 pid=3635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.727000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 01:20:33.729204 kubelet[3605]: E0128 01:20:33.729183 3605 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.8.20:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 28 01:20:33.728000 audit[3634]: NETFILTER_CFG table=mangle:52 family=2 entries=1 op=nft_register_chain pid=3634 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.728000 audit[3634]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcf02f40c0 a2=0 a3=0 items=0 ppid=3605 pid=3634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.729489 kubelet[3605]: I0128 01:20:33.729481 3605 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 01:20:33.729548 kubelet[3605]: I0128 01:20:33.729542 3605 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 28 01:20:33.729586 kubelet[3605]: I0128 01:20:33.729582 3605 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:20:33.728000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 01:20:33.729000 audit[3638]: NETFILTER_CFG table=nat:53 family=10 entries=1 op=nft_register_chain pid=3638 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:33.729000 audit[3638]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcf15b7bf0 a2=0 a3=0 items=0 ppid=3605 pid=3638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.729000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 01:20:33.730000 audit[3639]: NETFILTER_CFG table=nat:54 family=2 entries=1 op=nft_register_chain pid=3639 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.730000 audit[3639]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdbd125460 a2=0 a3=0 items=0 ppid=3605 pid=3639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.730000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 01:20:33.730000 audit[3640]: NETFILTER_CFG table=filter:55 family=10 entries=1 op=nft_register_chain pid=3640 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:33.730000 audit[3640]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffea80ba970 a2=0 a3=0 items=0 ppid=3605 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.730000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 01:20:33.731000 audit[3641]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_chain pid=3641 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:33.731000 audit[3641]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffca9790b50 a2=0 a3=0 items=0 ppid=3605 pid=3641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:33.731000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 01:20:33.733843 kubelet[3605]: I0128 01:20:33.733830 3605 policy_none.go:49] "None policy: Start" Jan 28 01:20:33.733882 kubelet[3605]: I0128 01:20:33.733851 3605 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 01:20:33.733882 kubelet[3605]: I0128 01:20:33.733861 3605 state_mem.go:35] "Initializing new in-memory state store" Jan 28 01:20:33.740795 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 28 01:20:33.749705 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 28 01:20:33.752458 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 28 01:20:33.763458 kubelet[3605]: E0128 01:20:33.763435 3605 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 01:20:33.763579 kubelet[3605]: I0128 01:20:33.763569 3605 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 01:20:33.763612 kubelet[3605]: I0128 01:20:33.763583 3605 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 01:20:33.765174 kubelet[3605]: E0128 01:20:33.765109 3605 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 28 01:20:33.765551 kubelet[3605]: E0128 01:20:33.765285 3605 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593.0.0-n-84a137a86c\" not found" Jan 28 01:20:33.765688 kubelet[3605]: I0128 01:20:33.765249 3605 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 01:20:33.845672 systemd[1]: Created slice kubepods-burstable-pod7991eea343ca8b773f09d773b61e0fec.slice - libcontainer container kubepods-burstable-pod7991eea343ca8b773f09d773b61e0fec.slice. 
Jan 28 01:20:33.853446 kubelet[3605]: E0128 01:20:33.853426 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.858215 systemd[1]: Created slice kubepods-burstable-pod48e496c98a1698a750100321ca83521c.slice - libcontainer container kubepods-burstable-pod48e496c98a1698a750100321ca83521c.slice. Jan 28 01:20:33.860375 kubelet[3605]: E0128 01:20:33.860224 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.865442 kubelet[3605]: I0128 01:20:33.865427 3605 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.865709 kubelet[3605]: E0128 01:20:33.865684 3605 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.868492 systemd[1]: Created slice kubepods-burstable-pod0edce0f3fa1b6ab7111214e76c964adc.slice - libcontainer container kubepods-burstable-pod0edce0f3fa1b6ab7111214e76c964adc.slice. Jan 28 01:20:33.869997 kubelet[3605]: E0128 01:20:33.869980 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893174 kubelet[3605]: I0128 01:20:33.893152 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893279 kubelet[3605]: I0128 01:20:33.893251 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-ca-certs\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893329 kubelet[3605]: I0128 01:20:33.893287 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-flexvolume-dir\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893329 kubelet[3605]: I0128 01:20:33.893302 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-k8s-certs\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893329 kubelet[3605]: I0128 01:20:33.893319 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-kubeconfig\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893449 kubelet[3605]: I0128 01:20:33.893337 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0edce0f3fa1b6ab7111214e76c964adc-kubeconfig\") pod \"kube-scheduler-ci-4593.0.0-n-84a137a86c\" (UID: \"0edce0f3fa1b6ab7111214e76c964adc\") " pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893449 kubelet[3605]: I0128 01:20:33.893353 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-ca-certs\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893449 kubelet[3605]: I0128 01:20:33.893373 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-k8s-certs\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.893449 kubelet[3605]: I0128 01:20:33.893391 3605 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:33.898482 kubelet[3605]: E0128 01:20:33.898464 3605 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593.0.0-n-84a137a86c?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="400ms" Jan 28 01:20:34.067455 kubelet[3605]: I0128 01:20:34.067429 3605 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.067798 kubelet[3605]: E0128 01:20:34.067776 3605 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.155164 containerd[2557]: time="2026-01-28T01:20:34.155075665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593.0.0-n-84a137a86c,Uid:7991eea343ca8b773f09d773b61e0fec,Namespace:kube-system,Attempt:0,}" Jan 28 01:20:34.161499 containerd[2557]: time="2026-01-28T01:20:34.161469118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593.0.0-n-84a137a86c,Uid:48e496c98a1698a750100321ca83521c,Namespace:kube-system,Attempt:0,}" Jan 28 01:20:34.171130 containerd[2557]: time="2026-01-28T01:20:34.171106745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593.0.0-n-84a137a86c,Uid:0edce0f3fa1b6ab7111214e76c964adc,Namespace:kube-system,Attempt:0,}" Jan 28 01:20:34.240968 containerd[2557]: time="2026-01-28T01:20:34.240804801Z" level=info 
msg="connecting to shim 6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e" address="unix:///run/containerd/s/c149eee667e48d225b9c4483771576fdab1d681bda816f3ba04ce5cff5ca2c05" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:20:34.243420 containerd[2557]: time="2026-01-28T01:20:34.243381823Z" level=info msg="connecting to shim 4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64" address="unix:///run/containerd/s/426eb18e23a6f92d752994841d9ac889c27cd87728a6bf4f82fae06265ebc5c0" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:20:34.274454 containerd[2557]: time="2026-01-28T01:20:34.274428232Z" level=info msg="connecting to shim 9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9" address="unix:///run/containerd/s/a26453566f4bca7f536d0c735e49e542c431bf47cf46a754b99f57cacd0cc7a9" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:20:34.275226 systemd[1]: Started cri-containerd-6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e.scope - libcontainer container 6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e. Jan 28 01:20:34.282470 systemd[1]: Started cri-containerd-4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64.scope - libcontainer container 4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64. Jan 28 01:20:34.288000 audit: BPF prog-id=107 op=LOAD Jan 28 01:20:34.288000 audit: BPF prog-id=108 op=LOAD Jan 28 01:20:34.288000 audit[3683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.288000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=108 op=UNLOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=109 op=LOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=110 op=LOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=110 op=UNLOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=109 op=UNLOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.289000 audit: BPF prog-id=111 op=LOAD Jan 28 01:20:34.289000 audit[3683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3659 pid=3683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.289000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665353931643137333336383033313635323064656364643238666334 Jan 28 01:20:34.299135 systemd[1]: Started cri-containerd-9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9.scope - libcontainer container 9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9. 
Jan 28 01:20:34.300124 kubelet[3605]: E0128 01:20:34.300099 3605 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593.0.0-n-84a137a86c?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="800ms" Jan 28 01:20:34.303000 audit: BPF prog-id=112 op=LOAD Jan 28 01:20:34.304000 audit: BPF prog-id=113 op=LOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=113 op=UNLOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=114 op=LOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=115 op=LOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=115 op=UNLOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=114 op=UNLOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.304000 audit: BPF prog-id=116 op=LOAD Jan 28 01:20:34.304000 audit[3693]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3654 pid=3693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663436653963636364333765303032663331316530656632323734 Jan 28 01:20:34.323000 audit: BPF prog-id=117 op=LOAD Jan 28 01:20:34.326000 audit: BPF prog-id=118 op=LOAD Jan 28 01:20:34.326000 audit[3728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.326000 audit: BPF prog-id=118 op=UNLOAD Jan 28 01:20:34.326000 audit[3728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.326000 audit: BPF prog-id=119 op=LOAD Jan 28 01:20:34.326000 audit[3728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.326000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.326000 audit: BPF prog-id=120 op=LOAD Jan 28 01:20:34.326000 audit[3728]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.326000 audit: BPF prog-id=120 op=UNLOAD Jan 28 01:20:34.326000 audit[3728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.327000 audit: BPF prog-id=119 op=UNLOAD Jan 28 01:20:34.327000 audit[3728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.327000 audit: BPF prog-id=121 op=LOAD Jan 28 01:20:34.327000 audit[3728]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3705 pid=3728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962306362656234666665346135653433363334666337316535326530 Jan 28 01:20:34.333434 containerd[2557]: time="2026-01-28T01:20:34.333375537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593.0.0-n-84a137a86c,Uid:7991eea343ca8b773f09d773b61e0fec,Namespace:kube-system,Attempt:0,} returns sandbox id \"6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e\"" Jan 28 01:20:34.354964 containerd[2557]: time="2026-01-28T01:20:34.354842121Z" level=info msg="CreateContainer within sandbox \"6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e\" for container 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 28 01:20:34.355930 containerd[2557]: time="2026-01-28T01:20:34.355894471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593.0.0-n-84a137a86c,Uid:48e496c98a1698a750100321ca83521c,Namespace:kube-system,Attempt:0,} returns sandbox id \"4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64\"" Jan 28 01:20:34.364055 containerd[2557]: time="2026-01-28T01:20:34.363979959Z" level=info msg="CreateContainer within sandbox \"4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 28 01:20:34.366365 containerd[2557]: time="2026-01-28T01:20:34.366333588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593.0.0-n-84a137a86c,Uid:0edce0f3fa1b6ab7111214e76c964adc,Namespace:kube-system,Attempt:0,} returns sandbox id \"9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9\"" Jan 28 01:20:34.371723 containerd[2557]: time="2026-01-28T01:20:34.371703316Z" level=info msg="CreateContainer within sandbox \"9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 28 01:20:34.378234 containerd[2557]: time="2026-01-28T01:20:34.378211620Z" level=info msg="Container 4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:34.383654 containerd[2557]: time="2026-01-28T01:20:34.383630276Z" level=info msg="Container 7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:34.404498 containerd[2557]: time="2026-01-28T01:20:34.404473021Z" level=info msg="Container 9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:34.418790 containerd[2557]: time="2026-01-28T01:20:34.418718440Z" level=info msg="CreateContainer within sandbox \"6e591d1733680316520decdd28fc47b733e05eb305cce1a451381e285069da8e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89\"" Jan 28 01:20:34.420197 containerd[2557]: time="2026-01-28T01:20:34.420176069Z" level=info msg="StartContainer for \"4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89\"" Jan 28 01:20:34.427054 containerd[2557]: time="2026-01-28T01:20:34.427018273Z" level=info msg="connecting to shim 4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89" address="unix:///run/containerd/s/c149eee667e48d225b9c4483771576fdab1d681bda816f3ba04ce5cff5ca2c05" protocol=ttrpc version=3 Jan 28 01:20:34.429022 containerd[2557]: time="2026-01-28T01:20:34.428991269Z" level=info msg="CreateContainer within sandbox \"4ef46e9cccd37e002f311e0ef2274938c2160b225d1db28d2cd87453b0348c64\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64\"" Jan 28 01:20:34.429788 containerd[2557]: time="2026-01-28T01:20:34.429733370Z" level=info msg="StartContainer for \"7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64\"" Jan 28 01:20:34.435723 containerd[2557]: time="2026-01-28T01:20:34.435252301Z" level=info msg="connecting to shim 7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64" 
address="unix:///run/containerd/s/426eb18e23a6f92d752994841d9ac889c27cd87728a6bf4f82fae06265ebc5c0" protocol=ttrpc version=3 Jan 28 01:20:34.437429 containerd[2557]: time="2026-01-28T01:20:34.437408324Z" level=info msg="CreateContainer within sandbox \"9b0cbeb4ffe4a5e43634fc71e52e0b49fae71c0d7fa828bb5a77979e2e5593d9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae\"" Jan 28 01:20:34.438711 containerd[2557]: time="2026-01-28T01:20:34.438692435Z" level=info msg="StartContainer for \"9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae\"" Jan 28 01:20:34.439964 containerd[2557]: time="2026-01-28T01:20:34.439645146Z" level=info msg="connecting to shim 9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae" address="unix:///run/containerd/s/a26453566f4bca7f536d0c735e49e542c431bf47cf46a754b99f57cacd0cc7a9" protocol=ttrpc version=3 Jan 28 01:20:34.442910 systemd[1]: Started cri-containerd-4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89.scope - libcontainer container 4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89. Jan 28 01:20:34.455145 systemd[1]: Started cri-containerd-7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64.scope - libcontainer container 7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64. Jan 28 01:20:34.463265 systemd[1]: Started cri-containerd-9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae.scope - libcontainer container 9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae. Jan 28 01:20:34.464000 audit: BPF prog-id=122 op=LOAD Jan 28 01:20:34.465000 audit: BPF prog-id=123 op=LOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=123 op=UNLOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=124 op=LOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=125 op=LOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=125 op=UNLOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=124 op=UNLOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.465000 audit: BPF prog-id=126 op=LOAD Jan 28 01:20:34.465000 audit[3785]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3659 pid=3785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.465000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462306638666463376264333730373565333263343530316231326534 Jan 28 01:20:34.470239 kubelet[3605]: I0128 01:20:34.469897 3605 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.470239 kubelet[3605]: E0128 01:20:34.470215 3605 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.472000 audit: BPF prog-id=127 op=LOAD Jan 28 01:20:34.472000 audit: BPF prog-id=128 op=LOAD Jan 28 01:20:34.472000 
audit[3797]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.472000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.473000 audit: BPF prog-id=128 op=UNLOAD Jan 28 01:20:34.473000 audit[3797]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.473000 audit: BPF prog-id=129 op=LOAD Jan 28 01:20:34.473000 audit[3797]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.473000 audit: BPF prog-id=130 op=LOAD Jan 28 01:20:34.473000 audit[3797]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.473000 audit: BPF prog-id=130 op=UNLOAD Jan 28 01:20:34.473000 audit[3797]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.474000 audit: BPF prog-id=129 op=UNLOAD Jan 28 01:20:34.474000 audit[3797]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.474000 audit: BPF prog-id=131 op=LOAD Jan 28 01:20:34.474000 audit[3797]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3654 pid=3797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765306331616631663062626462393431663665653334343638396131 Jan 28 01:20:34.478000 audit: BPF prog-id=132 op=LOAD Jan 28 01:20:34.479000 audit: BPF prog-id=133 op=LOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=133 op=UNLOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=134 op=LOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=135 op=LOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=135 op=UNLOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=134 op=UNLOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.479000 audit: BPF prog-id=136 op=LOAD Jan 28 01:20:34.479000 audit[3798]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3705 pid=3798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:34.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963373636393737333039326632663631346437646136356136653030 Jan 28 01:20:34.520756 containerd[2557]: time="2026-01-28T01:20:34.520292631Z" level=info msg="StartContainer for \"4b0f8fdc7bd37075e32c4501b12e487a5d2a4025f5f63b88fcf8e51e41293a89\" returns successfully" Jan 28 01:20:34.542197 containerd[2557]: time="2026-01-28T01:20:34.542080514Z" level=info msg="StartContainer for \"7e0c1af1f0bbdb941f6ee344689a16aded71e02f301b74498abd9f4e2a8e4e64\" returns successfully" Jan 28 01:20:34.556977 kernel: hv_balloon: Max. 
dynamic memory size: 8192 MB Jan 28 01:20:34.559375 containerd[2557]: time="2026-01-28T01:20:34.559114082Z" level=info msg="StartContainer for \"9c7669773092f2f614d7da65a6e0085fe5a29436b5a3eba06ab0a565fddd68ae\" returns successfully" Jan 28 01:20:34.737129 kubelet[3605]: E0128 01:20:34.737103 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.740698 kubelet[3605]: E0128 01:20:34.740680 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:34.742719 kubelet[3605]: E0128 01:20:34.742701 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:35.274119 kubelet[3605]: I0128 01:20:35.274092 3605 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:35.745342 kubelet[3605]: E0128 01:20:35.745175 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:35.747971 kubelet[3605]: E0128 01:20:35.745729 3605 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.804555 kubelet[3605]: E0128 01:20:36.804517 3605 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4593.0.0-n-84a137a86c\" not found" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.859487 kubelet[3605]: I0128 01:20:36.859456 3605 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.859487 kubelet[3605]: E0128 01:20:36.859490 3605 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4593.0.0-n-84a137a86c\": node \"ci-4593.0.0-n-84a137a86c\" not found" Jan 28 01:20:36.890709 kubelet[3605]: I0128 01:20:36.890681 3605 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.922964 kubelet[3605]: E0128 01:20:36.922867 3605 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593.0.0-n-84a137a86c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.922964 kubelet[3605]: I0128 01:20:36.922890 3605 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.930318 kubelet[3605]: E0128 01:20:36.930160 3605 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.930318 kubelet[3605]: I0128 01:20:36.930180 3605 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:36.935128 kubelet[3605]: E0128 01:20:36.935110 3605 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" is 
forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:37.676301 kubelet[3605]: I0128 01:20:37.676267 3605 apiserver.go:52] "Watching apiserver" Jan 28 01:20:37.692323 kubelet[3605]: I0128 01:20:37.692284 3605 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 01:20:38.712128 update_engine[2537]: I20260128 01:20:38.712074 2537 update_attempter.cc:509] Updating boot flags... Jan 28 01:20:38.774109 systemd[1]: Reload requested from client PID 3907 ('systemctl') (unit session-10.scope)... Jan 28 01:20:38.774133 systemd[1]: Reloading... Jan 28 01:20:38.901987 zram_generator::config[3973]: No configuration found. Jan 28 01:20:39.079681 systemd[1]: Reloading finished in 305 ms. Jan 28 01:20:39.148624 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 01:20:39.169593 systemd[1]: kubelet.service: Deactivated successfully. Jan 28 01:20:39.169826 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:39.168000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:39.170582 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 28 01:20:39.170616 kernel: audit: type=1131 audit(1769563239.168:417): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:39.170676 systemd[1]: kubelet.service: Consumed 625ms CPU time, 130.9M memory peak. Jan 28 01:20:39.177530 kernel: audit: type=1334 audit(1769563239.174:418): prog-id=137 op=LOAD Jan 28 01:20:39.177598 kernel: audit: type=1334 audit(1769563239.174:419): prog-id=92 op=UNLOAD Jan 28 01:20:39.174000 audit: BPF prog-id=137 op=LOAD Jan 28 01:20:39.174000 audit: BPF prog-id=92 op=UNLOAD Jan 28 01:20:39.175171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
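
Aside: the kernel audit records above carry their own timestamp of the form audit(<epoch seconds>:<serial>), e.g. audit(1769563239.168:417). A minimal Python sketch (illustrative only, not part of the log) that converts that value back to wall-clock time:

    from datetime import datetime, timezone

    # 'audit(1769563239.168:417)': seconds since the Unix epoch, then a record serial.
    stamp = "1769563239.168:417"
    epoch, serial = stamp.rsplit(":", 1)
    when = datetime.fromtimestamp(float(epoch), tz=timezone.utc)
    print(when.isoformat(), "serial", serial)
    # 2026-01-28T01:20:39.168000+00:00 serial 417 -- matches the journal timestamp of that record.
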
Jan 28 01:20:39.180054 kernel: audit: type=1334 audit(1769563239.176:420): prog-id=138 op=LOAD Jan 28 01:20:39.176000 audit: BPF prog-id=138 op=LOAD Jan 28 01:20:39.182766 kernel: audit: type=1334 audit(1769563239.176:421): prog-id=96 op=UNLOAD Jan 28 01:20:39.176000 audit: BPF prog-id=96 op=UNLOAD Jan 28 01:20:39.176000 audit: BPF prog-id=139 op=LOAD Jan 28 01:20:39.176000 audit: BPF prog-id=140 op=LOAD Jan 28 01:20:39.176000 audit: BPF prog-id=97 op=UNLOAD Jan 28 01:20:39.176000 audit: BPF prog-id=98 op=UNLOAD Jan 28 01:20:39.182965 kernel: audit: type=1334 audit(1769563239.176:422): prog-id=139 op=LOAD Jan 28 01:20:39.182980 kernel: audit: type=1334 audit(1769563239.176:423): prog-id=140 op=LOAD Jan 28 01:20:39.182994 kernel: audit: type=1334 audit(1769563239.176:424): prog-id=97 op=UNLOAD Jan 28 01:20:39.183010 kernel: audit: type=1334 audit(1769563239.176:425): prog-id=98 op=UNLOAD Jan 28 01:20:39.180000 audit: BPF prog-id=141 op=LOAD Jan 28 01:20:39.180000 audit: BPF prog-id=106 op=UNLOAD Jan 28 01:20:39.181000 audit: BPF prog-id=142 op=LOAD Jan 28 01:20:39.181000 audit: BPF prog-id=102 op=UNLOAD Jan 28 01:20:39.181000 audit: BPF prog-id=143 op=LOAD Jan 28 01:20:39.181000 audit: BPF prog-id=144 op=LOAD Jan 28 01:20:39.181000 audit: BPF prog-id=103 op=UNLOAD Jan 28 01:20:39.181000 audit: BPF prog-id=104 op=UNLOAD Jan 28 01:20:39.182000 audit: BPF prog-id=145 op=LOAD Jan 28 01:20:39.182000 audit: BPF prog-id=99 op=UNLOAD Jan 28 01:20:39.184784 kernel: audit: type=1334 audit(1769563239.180:426): prog-id=141 op=LOAD Jan 28 01:20:39.182000 audit: BPF prog-id=146 op=LOAD Jan 28 01:20:39.182000 audit: BPF prog-id=147 op=LOAD Jan 28 01:20:39.182000 audit: BPF prog-id=100 op=UNLOAD Jan 28 01:20:39.182000 audit: BPF prog-id=101 op=UNLOAD Jan 28 01:20:39.183000 audit: BPF prog-id=148 op=LOAD Jan 28 01:20:39.183000 audit: BPF prog-id=105 op=UNLOAD Jan 28 01:20:39.183000 audit: BPF prog-id=149 op=LOAD Jan 28 01:20:39.183000 audit: BPF prog-id=93 op=UNLOAD Jan 28 01:20:39.184000 audit: BPF prog-id=150 op=LOAD Jan 28 01:20:39.184000 audit: BPF prog-id=151 op=LOAD Jan 28 01:20:39.184000 audit: BPF prog-id=94 op=UNLOAD Jan 28 01:20:39.184000 audit: BPF prog-id=95 op=UNLOAD Jan 28 01:20:39.184000 audit: BPF prog-id=152 op=LOAD Jan 28 01:20:39.184000 audit: BPF prog-id=153 op=LOAD Jan 28 01:20:39.184000 audit: BPF prog-id=90 op=UNLOAD Jan 28 01:20:39.184000 audit: BPF prog-id=91 op=UNLOAD Jan 28 01:20:39.185000 audit: BPF prog-id=154 op=LOAD Jan 28 01:20:39.185000 audit: BPF prog-id=87 op=UNLOAD Jan 28 01:20:39.185000 audit: BPF prog-id=155 op=LOAD Jan 28 01:20:39.185000 audit: BPF prog-id=156 op=LOAD Jan 28 01:20:39.185000 audit: BPF prog-id=88 op=UNLOAD Jan 28 01:20:39.185000 audit: BPF prog-id=89 op=UNLOAD Jan 28 01:20:39.647176 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 01:20:39.648000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:39.653179 (kubelet)[4038]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 01:20:39.688859 kubelet[4038]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
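
Aside: the burst of "audit: BPF prog-id=... op=LOAD/UNLOAD" records above accompanies the systemd reload, most likely systemd swapping out per-unit cgroup BPF programs; that is an interpretation, not something the log states. A small sketch that tallies this churn from a chunk of journal text:

    import re
    from collections import Counter

    # Count 'audit: BPF prog-id=<n> op=LOAD|UNLOAD' records in journal text.
    BPF_RE = re.compile(r"audit: BPF prog-id=(\d+) op=(LOAD|UNLOAD)")

    def bpf_churn(journal_text: str) -> Counter:
        return Counter(op for _prog_id, op in BPF_RE.findall(journal_text))

    sample = "audit: BPF prog-id=141 op=LOAD ... audit: BPF prog-id=106 op=UNLOAD"
    print(bpf_churn(sample))  # Counter({'LOAD': 1, 'UNLOAD': 1})
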
Jan 28 01:20:39.688859 kubelet[4038]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 01:20:39.688859 kubelet[4038]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 01:20:39.689162 kubelet[4038]: I0128 01:20:39.688905 4038 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 01:20:39.694934 kubelet[4038]: I0128 01:20:39.694669 4038 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 01:20:39.694934 kubelet[4038]: I0128 01:20:39.694688 4038 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 01:20:39.694934 kubelet[4038]: I0128 01:20:39.694854 4038 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 01:20:39.696723 kubelet[4038]: I0128 01:20:39.696440 4038 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 28 01:20:39.699000 kubelet[4038]: I0128 01:20:39.698982 4038 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 01:20:39.704454 kubelet[4038]: I0128 01:20:39.704105 4038 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 01:20:39.708973 kubelet[4038]: I0128 01:20:39.708537 4038 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 28 01:20:39.708973 kubelet[4038]: I0128 01:20:39.708705 4038 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 01:20:39.708973 kubelet[4038]: I0128 01:20:39.708723 4038 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4593.0.0-n-84a137a86c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 01:20:39.708973 kubelet[4038]: I0128 01:20:39.708938 4038 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 01:20:39.709172 kubelet[4038]: I0128 01:20:39.709165 4038 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 01:20:39.709393 kubelet[4038]: I0128 01:20:39.709381 4038 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:20:39.709520 kubelet[4038]: I0128 01:20:39.709511 4038 kubelet.go:480] "Attempting to sync node with API server" Jan 28 01:20:39.709550 kubelet[4038]: I0128 01:20:39.709525 4038 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 01:20:39.709550 kubelet[4038]: I0128 01:20:39.709546 4038 kubelet.go:386] "Adding apiserver pod source" Jan 28 01:20:39.709582 kubelet[4038]: I0128 01:20:39.709558 4038 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 01:20:39.712340 kubelet[4038]: I0128 01:20:39.712308 4038 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 01:20:39.712969 kubelet[4038]: I0128 01:20:39.712857 4038 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 01:20:39.717122 kubelet[4038]: I0128 01:20:39.716891 4038 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 01:20:39.717122 kubelet[4038]: I0128 01:20:39.716938 4038 server.go:1289] "Started kubelet" Jan 28 01:20:39.719082 kubelet[4038]: I0128 01:20:39.719068 4038 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 01:20:39.724975 kubelet[4038]: I0128 01:20:39.724480 4038 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 28 01:20:39.727196 kubelet[4038]: I0128 01:20:39.727160 4038 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 01:20:39.727893 kubelet[4038]: I0128 01:20:39.727875 4038 server.go:317] "Adding debug handlers to kubelet server" Jan 28 01:20:39.732478 kubelet[4038]: I0128 01:20:39.731050 4038 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 01:20:39.732478 kubelet[4038]: I0128 01:20:39.731515 4038 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 01:20:39.732478 kubelet[4038]: I0128 01:20:39.731684 4038 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 01:20:39.733305 kubelet[4038]: I0128 01:20:39.733292 4038 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 01:20:39.733936 kubelet[4038]: E0128 01:20:39.733462 4038 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593.0.0-n-84a137a86c\" not found" Jan 28 01:20:39.737837 kubelet[4038]: I0128 01:20:39.737816 4038 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 01:20:39.737917 kubelet[4038]: I0128 01:20:39.737903 4038 reconciler.go:26] "Reconciler: start to sync state" Jan 28 01:20:39.739979 kubelet[4038]: I0128 01:20:39.739440 4038 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 28 01:20:39.739979 kubelet[4038]: I0128 01:20:39.739458 4038 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 01:20:39.739979 kubelet[4038]: I0128 01:20:39.739471 4038 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 28 01:20:39.739979 kubelet[4038]: I0128 01:20:39.739480 4038 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 01:20:39.739979 kubelet[4038]: E0128 01:20:39.739512 4038 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 01:20:39.741009 kubelet[4038]: I0128 01:20:39.740362 4038 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 01:20:39.743681 kubelet[4038]: I0128 01:20:39.743669 4038 factory.go:223] Registration of the containerd container factory successfully Jan 28 01:20:39.743745 kubelet[4038]: I0128 01:20:39.743741 4038 factory.go:223] Registration of the systemd container factory successfully Jan 28 01:20:39.759150 kubelet[4038]: E0128 01:20:39.759121 4038 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 28 01:20:39.795598 kubelet[4038]: I0128 01:20:39.795588 4038 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 01:20:39.795693 kubelet[4038]: I0128 01:20:39.795686 4038 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 28 01:20:39.795733 kubelet[4038]: I0128 01:20:39.795729 4038 state_mem.go:36] "Initialized new in-memory state store" Jan 28 01:20:39.796205 kubelet[4038]: I0128 01:20:39.796194 4038 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 28 01:20:39.796260 kubelet[4038]: I0128 01:20:39.796250 4038 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 28 01:20:39.796287 kubelet[4038]: I0128 01:20:39.796283 4038 policy_none.go:49] "None policy: Start" Jan 28 01:20:39.796321 kubelet[4038]: I0128 01:20:39.796316 4038 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 01:20:39.796379 kubelet[4038]: I0128 01:20:39.796366 4038 state_mem.go:35] "Initializing new in-memory state store" Jan 28 01:20:39.796498 kubelet[4038]: I0128 01:20:39.796494 4038 state_mem.go:75] "Updated machine memory state" Jan 28 01:20:39.800209 kubelet[4038]: E0128 01:20:39.800131 4038 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 01:20:39.800366 kubelet[4038]: I0128 01:20:39.800346 4038 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 01:20:39.800426 kubelet[4038]: I0128 01:20:39.800406 4038 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 01:20:39.800772 kubelet[4038]: I0128 01:20:39.800736 4038 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 01:20:39.804300 kubelet[4038]: E0128 01:20:39.803516 4038 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 28 01:20:39.840571 kubelet[4038]: I0128 01:20:39.840471 4038 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.840571 kubelet[4038]: I0128 01:20:39.840530 4038 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.840571 kubelet[4038]: I0128 01:20:39.840457 4038 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.847857 kubelet[4038]: I0128 01:20:39.847793 4038 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Jan 28 01:20:39.850913 kubelet[4038]: I0128 01:20:39.850899 4038 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Jan 28 01:20:39.851557 kubelet[4038]: I0128 01:20:39.851545 4038 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Jan 28 01:20:39.902901 kubelet[4038]: I0128 01:20:39.902831 4038 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.913852 kubelet[4038]: I0128 01:20:39.913649 4038 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.913852 kubelet[4038]: I0128 01:20:39.913696 4038 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.939703 kubelet[4038]: I0128 01:20:39.939688 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-ca-certs\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.939820 kubelet[4038]: I0128 01:20:39.939809 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.939904 kubelet[4038]: I0128 01:20:39.939895 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-ca-certs\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.939992 kubelet[4038]: I0128 01:20:39.939983 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-kubeconfig\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.940106 kubelet[4038]: I0128 01:20:39.940058 4038 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.940106 kubelet[4038]: I0128 01:20:39.940076 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0edce0f3fa1b6ab7111214e76c964adc-kubeconfig\") pod \"kube-scheduler-ci-4593.0.0-n-84a137a86c\" (UID: \"0edce0f3fa1b6ab7111214e76c964adc\") " pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.940106 kubelet[4038]: I0128 01:20:39.940091 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7991eea343ca8b773f09d773b61e0fec-k8s-certs\") pod \"kube-apiserver-ci-4593.0.0-n-84a137a86c\" (UID: \"7991eea343ca8b773f09d773b61e0fec\") " pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.940204 kubelet[4038]: I0128 01:20:39.940196 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-flexvolume-dir\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:39.940265 kubelet[4038]: I0128 01:20:39.940258 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48e496c98a1698a750100321ca83521c-k8s-certs\") pod \"kube-controller-manager-ci-4593.0.0-n-84a137a86c\" (UID: \"48e496c98a1698a750100321ca83521c\") " pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:40.710698 kubelet[4038]: I0128 01:20:40.710666 4038 apiserver.go:52] "Watching apiserver" Jan 28 01:20:40.738773 kubelet[4038]: I0128 01:20:40.738742 4038 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 01:20:40.780068 kubelet[4038]: I0128 01:20:40.779550 4038 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:40.788551 kubelet[4038]: I0128 01:20:40.788498 4038 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Jan 28 01:20:40.788740 kubelet[4038]: E0128 01:20:40.788643 4038 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593.0.0-n-84a137a86c\" already exists" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" Jan 28 01:20:40.803694 kubelet[4038]: I0128 01:20:40.803469 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4593.0.0-n-84a137a86c" podStartSLOduration=1.803455421 podStartE2EDuration="1.803455421s" podCreationTimestamp="2026-01-28 01:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:20:40.79580318 +0000 UTC m=+1.139421841" watchObservedRunningTime="2026-01-28 01:20:40.803455421 
+0000 UTC m=+1.147074077" Jan 28 01:20:40.813210 kubelet[4038]: I0128 01:20:40.813139 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593.0.0-n-84a137a86c" podStartSLOduration=1.813127434 podStartE2EDuration="1.813127434s" podCreationTimestamp="2026-01-28 01:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:20:40.803879012 +0000 UTC m=+1.147497667" watchObservedRunningTime="2026-01-28 01:20:40.813127434 +0000 UTC m=+1.156746122" Jan 28 01:20:40.822271 kubelet[4038]: I0128 01:20:40.822227 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593.0.0-n-84a137a86c" podStartSLOduration=1.822216005 podStartE2EDuration="1.822216005s" podCreationTimestamp="2026-01-28 01:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:20:40.813542449 +0000 UTC m=+1.157161102" watchObservedRunningTime="2026-01-28 01:20:40.822216005 +0000 UTC m=+1.165834670" Jan 28 01:20:45.345215 kubelet[4038]: I0128 01:20:45.345162 4038 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 28 01:20:45.345644 kubelet[4038]: I0128 01:20:45.345605 4038 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 28 01:20:45.345680 containerd[2557]: time="2026-01-28T01:20:45.345461977Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 28 01:20:45.920184 systemd[1]: Created slice kubepods-besteffort-pode42516f0_ab33_4f5e_9394_62784b4d47b2.slice - libcontainer container kubepods-besteffort-pode42516f0_ab33_4f5e_9394_62784b4d47b2.slice. 
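
Aside: the kubelet messages above are in the klog text format: a severity letter (I/E/W), MMDD date, time, a process/thread id, source file:line, then the structured message. A minimal parser sketch for lines in that shape (illustrative only):

    import re

    # Parse klog-style lines such as:
    # 'I0128 01:20:39.913696    4038 kubelet_node_status.go:78] "Successfully registered node" ...'
    KLOG_RE = re.compile(
        r'(?P<sev>[IWEF])(?P<date>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
        r'(?P<pid>\d+) (?P<src>[\w.]+:\d+)\] (?P<msg>.*)'
    )

    line = ('I0128 01:20:39.913696    4038 kubelet_node_status.go:78] '
            '"Successfully registered node" node="ci-4593.0.0-n-84a137a86c"')
    m = KLOG_RE.match(line)
    if m:
        print(m.group("sev"), m.group("src"), m.group("msg"))
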
Jan 28 01:20:45.984005 kubelet[4038]: I0128 01:20:45.983935 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcr2j\" (UniqueName: \"kubernetes.io/projected/e42516f0-ab33-4f5e-9394-62784b4d47b2-kube-api-access-wcr2j\") pod \"kube-proxy-hjwfm\" (UID: \"e42516f0-ab33-4f5e-9394-62784b4d47b2\") " pod="kube-system/kube-proxy-hjwfm" Jan 28 01:20:45.984005 kubelet[4038]: I0128 01:20:45.983980 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e42516f0-ab33-4f5e-9394-62784b4d47b2-kube-proxy\") pod \"kube-proxy-hjwfm\" (UID: \"e42516f0-ab33-4f5e-9394-62784b4d47b2\") " pod="kube-system/kube-proxy-hjwfm" Jan 28 01:20:45.984005 kubelet[4038]: I0128 01:20:45.984005 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e42516f0-ab33-4f5e-9394-62784b4d47b2-xtables-lock\") pod \"kube-proxy-hjwfm\" (UID: \"e42516f0-ab33-4f5e-9394-62784b4d47b2\") " pod="kube-system/kube-proxy-hjwfm" Jan 28 01:20:45.984171 kubelet[4038]: I0128 01:20:45.984018 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e42516f0-ab33-4f5e-9394-62784b4d47b2-lib-modules\") pod \"kube-proxy-hjwfm\" (UID: \"e42516f0-ab33-4f5e-9394-62784b4d47b2\") " pod="kube-system/kube-proxy-hjwfm" Jan 28 01:20:46.227454 containerd[2557]: time="2026-01-28T01:20:46.227414370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hjwfm,Uid:e42516f0-ab33-4f5e-9394-62784b4d47b2,Namespace:kube-system,Attempt:0,}" Jan 28 01:20:46.261380 containerd[2557]: time="2026-01-28T01:20:46.261304046Z" level=info msg="connecting to shim 811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0" address="unix:///run/containerd/s/aa157418c60a8e3ac830db578826557c00ae2bff766cb522c7601c534a825f3f" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:20:46.303127 systemd[1]: Started cri-containerd-811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0.scope - libcontainer container 811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0. 
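
Aside: the containerd entries above are logfmt-style key=value pairs (time=..., level=..., msg="..."). A small sketch that splits such a line into a dict, assuming only backslash-escaped quotes inside quoted values:

    import re

    LOGFMT_RE = re.compile(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)')

    def parse_logfmt(line: str) -> dict:
        out = {}
        for key, val in LOGFMT_RE.findall(line):
            if val.startswith('"') and val.endswith('"'):
                val = val[1:-1].replace('\\"', '"')
            out[key] = val
        return out

    sample = 'time="2026-01-28T01:20:46.227414370Z" level=info msg="RunPodSandbox for kube-proxy-hjwfm"'
    print(parse_logfmt(sample)["level"])  # info
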
Jan 28 01:20:46.333000 audit: BPF prog-id=157 op=LOAD Jan 28 01:20:46.335170 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 28 01:20:46.335252 kernel: audit: type=1334 audit(1769563246.333:459): prog-id=157 op=LOAD Jan 28 01:20:46.334000 audit: BPF prog-id=158 op=LOAD Jan 28 01:20:46.337410 kernel: audit: type=1334 audit(1769563246.334:460): prog-id=158 op=LOAD Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.343033 kernel: audit: type=1300 audit(1769563246.334:460): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.343298 kernel: audit: type=1327 audit(1769563246.334:460): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=158 op=UNLOAD Jan 28 01:20:46.355091 kernel: audit: type=1334 audit(1769563246.334:461): prog-id=158 op=UNLOAD Jan 28 01:20:46.355149 kernel: audit: type=1300 audit(1769563246.334:461): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.359156 kernel: audit: type=1327 audit(1769563246.334:461): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=159 op=LOAD Jan 28 01:20:46.364930 kernel: audit: type=1334 audit(1769563246.334:462): prog-id=159 op=LOAD Jan 28 01:20:46.365019 kernel: audit: type=1300 audit(1769563246.334:462): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.368975 kernel: audit: type=1327 audit(1769563246.334:462): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=160 op=LOAD Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=160 op=UNLOAD Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=159 op=UNLOAD Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.334000 audit: BPF prog-id=161 op=LOAD Jan 28 01:20:46.334000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4094 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.334000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831316363383536326466313034336164346233373039336135653264 Jan 28 01:20:46.377493 containerd[2557]: time="2026-01-28T01:20:46.377463889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hjwfm,Uid:e42516f0-ab33-4f5e-9394-62784b4d47b2,Namespace:kube-system,Attempt:0,} returns sandbox id \"811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0\"" Jan 28 01:20:46.383970 containerd[2557]: time="2026-01-28T01:20:46.383913912Z" level=info msg="CreateContainer within sandbox \"811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 28 01:20:46.406733 containerd[2557]: time="2026-01-28T01:20:46.406707850Z" level=info msg="Container 33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:46.565373 containerd[2557]: time="2026-01-28T01:20:46.565279969Z" level=info msg="CreateContainer within sandbox \"811cc8562df1043ad4b37093a5e2d3c24c00d795e13be525541303a49eacc1d0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6\"" Jan 28 01:20:46.567899 containerd[2557]: time="2026-01-28T01:20:46.567864676Z" level=info msg="StartContainer for \"33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6\"" Jan 28 01:20:46.569312 containerd[2557]: time="2026-01-28T01:20:46.569226772Z" level=info msg="connecting to shim 33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6" address="unix:///run/containerd/s/aa157418c60a8e3ac830db578826557c00ae2bff766cb522c7601c534a825f3f" protocol=ttrpc version=3 Jan 28 01:20:46.586177 systemd[1]: Started cri-containerd-33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6.scope - libcontainer container 33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6. 
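
Aside: the proctitle= fields in the runc audit records above are the process command lines, hex-encoded with NUL-separated arguments. A minimal sketch that decodes one such value (the sample below is a prefix of the strings recorded above):

    def decode_proctitle(hex_value: str) -> list:
        # The audit PROCTITLE field is the process argv, hex-encoded, NUL-separated.
        raw = bytes.fromhex(hex_value)
        return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00")]

    sample = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
    print(decode_proctitle(sample))
    # ['runc', '--root', '/run/containerd/runc/k8s.io']
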
Jan 28 01:20:46.636000 audit: BPF prog-id=162 op=LOAD Jan 28 01:20:46.636000 audit[4132]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4094 pid=4132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333663630663237613837386239363362363261393631653338343135 Jan 28 01:20:46.636000 audit: BPF prog-id=163 op=LOAD Jan 28 01:20:46.636000 audit[4132]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4094 pid=4132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333663630663237613837386239363362363261393631653338343135 Jan 28 01:20:46.636000 audit: BPF prog-id=163 op=UNLOAD Jan 28 01:20:46.636000 audit[4132]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333663630663237613837386239363362363261393631653338343135 Jan 28 01:20:46.636000 audit: BPF prog-id=162 op=UNLOAD Jan 28 01:20:46.636000 audit[4132]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4094 pid=4132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333663630663237613837386239363362363261393631653338343135 Jan 28 01:20:46.636000 audit: BPF prog-id=164 op=LOAD Jan 28 01:20:46.636000 audit[4132]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4094 pid=4132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333663630663237613837386239363362363261393631653338343135 Jan 28 01:20:46.729314 systemd[1]: Created slice kubepods-besteffort-pod3d81826c_d19f_4ff7_8a41_1ae525b4b4c5.slice - libcontainer container 
kubepods-besteffort-pod3d81826c_d19f_4ff7_8a41_1ae525b4b4c5.slice. Jan 28 01:20:46.788056 kubelet[4038]: I0128 01:20:46.788020 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qghrq\" (UniqueName: \"kubernetes.io/projected/3d81826c-d19f-4ff7-8a41-1ae525b4b4c5-kube-api-access-qghrq\") pod \"tigera-operator-7dcd859c48-mrvns\" (UID: \"3d81826c-d19f-4ff7-8a41-1ae525b4b4c5\") " pod="tigera-operator/tigera-operator-7dcd859c48-mrvns" Jan 28 01:20:46.788327 kubelet[4038]: I0128 01:20:46.788055 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3d81826c-d19f-4ff7-8a41-1ae525b4b4c5-var-lib-calico\") pod \"tigera-operator-7dcd859c48-mrvns\" (UID: \"3d81826c-d19f-4ff7-8a41-1ae525b4b4c5\") " pod="tigera-operator/tigera-operator-7dcd859c48-mrvns" Jan 28 01:20:46.827000 audit[4194]: NETFILTER_CFG table=mangle:57 family=10 entries=1 op=nft_register_chain pid=4194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:46.827000 audit[4194]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffce9f393a0 a2=0 a3=7ffce9f3938c items=0 ppid=4145 pid=4194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.827000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 01:20:46.829000 audit[4198]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=4198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:46.829000 audit[4198]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd623ba140 a2=0 a3=7ffd623ba12c items=0 ppid=4145 pid=4198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.829000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 01:20:46.830000 audit[4199]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=4199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:46.830000 audit[4199]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd833c6150 a2=0 a3=7ffd833c613c items=0 ppid=4145 pid=4199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.830000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 01:20:46.831000 audit[4200]: NETFILTER_CFG table=mangle:60 family=2 entries=1 op=nft_register_chain pid=4200 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.831000 audit[4200]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff7a1eb4d0 a2=0 a3=7fff7a1eb4bc items=0 ppid=4145 pid=4200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.831000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 01:20:46.832000 audit[4201]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=4201 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.832000 audit[4201]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffffcfc2350 a2=0 a3=7ffffcfc233c items=0 ppid=4145 pid=4201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.832000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 01:20:46.833000 audit[4202]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_chain pid=4202 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.833000 audit[4202]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff7cd4eea0 a2=0 a3=7fff7cd4ee8c items=0 ppid=4145 pid=4202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.833000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 01:20:46.864595 containerd[2557]: time="2026-01-28T01:20:46.864559045Z" level=info msg="StartContainer for \"33f60f27a878b963b62a961e38415571a10a3026e8942f08218b606af9bcaee6\" returns successfully" Jan 28 01:20:46.931000 audit[4204]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=4204 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.931000 audit[4204]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe15819cf0 a2=0 a3=7ffe15819cdc items=0 ppid=4145 pid=4204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.931000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 01:20:46.933000 audit[4206]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=4206 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.933000 audit[4206]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe7a2c1920 a2=0 a3=7ffe7a2c190c items=0 ppid=4145 pid=4206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.933000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 28 01:20:46.936000 audit[4209]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_rule pid=4209 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.936000 audit[4209]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe7586a950 a2=0 a3=7ffe7586a93c items=0 ppid=4145 pid=4209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.936000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 28 01:20:46.937000 audit[4210]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_chain pid=4210 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.937000 audit[4210]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff42dff0e0 a2=0 a3=7fff42dff0cc items=0 ppid=4145 pid=4210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.937000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 01:20:46.939000 audit[4212]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=4212 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.939000 audit[4212]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffffc523490 a2=0 a3=7ffffc52347c items=0 ppid=4145 pid=4212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.939000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 01:20:46.940000 audit[4213]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=4213 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.940000 audit[4213]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd83788760 a2=0 a3=7ffd8378874c items=0 ppid=4145 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.940000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 01:20:46.942000 audit[4215]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=4215 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.942000 audit[4215]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe5c0ea070 a2=0 a3=7ffe5c0ea05c items=0 ppid=4145 pid=4215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.942000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 01:20:46.945000 audit[4218]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_rule pid=4218 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.945000 audit[4218]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe5ff59fb0 a2=0 a3=7ffe5ff59f9c items=0 ppid=4145 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.945000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 28 01:20:46.945000 audit[4219]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_chain pid=4219 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.945000 audit[4219]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc9da5b220 a2=0 a3=7ffc9da5b20c items=0 ppid=4145 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.945000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 01:20:46.947000 audit[4221]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=4221 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.947000 audit[4221]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc0c80b6f0 a2=0 a3=7ffc0c80b6dc items=0 ppid=4145 pid=4221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.947000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 01:20:46.948000 audit[4222]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_chain pid=4222 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.948000 audit[4222]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc4d285600 a2=0 a3=7ffc4d2855ec items=0 ppid=4145 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.948000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 01:20:46.950000 audit[4224]: NETFILTER_CFG table=filter:74 family=2 entries=1 op=nft_register_rule pid=4224 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.950000 audit[4224]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc66f9a660 a2=0 a3=7ffc66f9a64c items=0 ppid=4145 pid=4224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.950000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:20:46.953000 audit[4227]: NETFILTER_CFG table=filter:75 family=2 entries=1 op=nft_register_rule pid=4227 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.953000 audit[4227]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcb60117f0 a2=0 a3=7ffcb60117dc items=0 ppid=4145 pid=4227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.953000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:20:46.956000 audit[4230]: NETFILTER_CFG table=filter:76 family=2 entries=1 op=nft_register_rule pid=4230 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.956000 audit[4230]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdd01d1c30 a2=0 a3=7ffdd01d1c1c items=0 ppid=4145 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.956000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 01:20:46.957000 audit[4231]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=4231 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.957000 audit[4231]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc10a5cbe0 a2=0 a3=7ffc10a5cbcc items=0 ppid=4145 pid=4231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.957000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 01:20:46.959000 audit[4233]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=4233 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.959000 audit[4233]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffca8d92b60 a2=0 a3=7ffca8d92b4c items=0 ppid=4145 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.959000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:20:46.962000 audit[4236]: NETFILTER_CFG table=nat:79 family=2 entries=1 op=nft_register_rule pid=4236 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.962000 audit[4236]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe369e6690 a2=0 a3=7ffe369e667c items=0 ppid=4145 pid=4236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.962000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:20:46.963000 audit[4237]: NETFILTER_CFG table=nat:80 family=2 entries=1 op=nft_register_chain pid=4237 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.963000 audit[4237]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1dbecf70 a2=0 a3=7ffd1dbecf5c items=0 ppid=4145 pid=4237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.963000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 01:20:46.965000 audit[4239]: NETFILTER_CFG table=nat:81 family=2 entries=1 op=nft_register_rule pid=4239 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 01:20:46.965000 audit[4239]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffdf5a70070 a2=0 a3=7ffdf5a7005c items=0 ppid=4145 pid=4239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:46.965000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 01:20:47.117737 containerd[2557]: time="2026-01-28T01:20:47.117660478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mrvns,Uid:3d81826c-d19f-4ff7-8a41-1ae525b4b4c5,Namespace:tigera-operator,Attempt:0,}" Jan 28 01:20:47.134000 audit[4245]: NETFILTER_CFG table=filter:82 family=2 entries=8 op=nft_register_rule pid=4245 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:47.134000 audit[4245]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffdcb7a73a0 a2=0 a3=7ffdcb7a738c items=0 ppid=4145 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.134000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:47.179000 audit[4245]: NETFILTER_CFG table=nat:83 family=2 entries=14 op=nft_register_chain pid=4245 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:47.179000 audit[4245]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffdcb7a73a0 a2=0 a3=7ffdcb7a738c items=0 ppid=4145 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.179000 audit: 
PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:47.181000 audit[4250]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=4250 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.181000 audit[4250]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffc35d51690 a2=0 a3=7ffc35d5167c items=0 ppid=4145 pid=4250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.181000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 01:20:47.183000 audit[4252]: NETFILTER_CFG table=filter:85 family=10 entries=2 op=nft_register_chain pid=4252 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.183000 audit[4252]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe192b5c40 a2=0 a3=7ffe192b5c2c items=0 ppid=4145 pid=4252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.183000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 28 01:20:47.187000 audit[4255]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_rule pid=4255 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.187000 audit[4255]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc55e75c10 a2=0 a3=7ffc55e75bfc items=0 ppid=4145 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.187000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 28 01:20:47.188000 audit[4256]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_chain pid=4256 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.188000 audit[4256]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc1e1c9fa0 a2=0 a3=7ffc1e1c9f8c items=0 ppid=4145 pid=4256 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.188000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 01:20:47.190000 audit[4258]: NETFILTER_CFG table=filter:88 family=10 entries=1 op=nft_register_rule pid=4258 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.190000 audit[4258]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd2c3cc9c0 a2=0 a3=7ffd2c3cc9ac items=0 ppid=4145 pid=4258 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.190000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 01:20:47.192000 audit[4259]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=4259 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.192000 audit[4259]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1d9289c0 a2=0 a3=7ffe1d9289ac items=0 ppid=4145 pid=4259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.192000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 01:20:47.194000 audit[4261]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=4261 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.194000 audit[4261]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc2a380490 a2=0 a3=7ffc2a38047c items=0 ppid=4145 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.194000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 28 01:20:47.197000 audit[4264]: NETFILTER_CFG table=filter:91 family=10 entries=2 op=nft_register_chain pid=4264 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.197000 audit[4264]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffe5e564510 a2=0 a3=7ffe5e5644fc items=0 ppid=4145 pid=4264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.197000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 01:20:47.198000 audit[4265]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_chain pid=4265 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.198000 audit[4265]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd074a6be0 a2=0 a3=7ffd074a6bcc items=0 ppid=4145 pid=4265 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.198000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 01:20:47.200000 audit[4267]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=4267 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.200000 audit[4267]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc848089b0 a2=0 a3=7ffc8480899c items=0 ppid=4145 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.200000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 01:20:47.201000 audit[4268]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_chain pid=4268 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.201000 audit[4268]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe670e3950 a2=0 a3=7ffe670e393c items=0 ppid=4145 pid=4268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.201000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 01:20:47.203000 audit[4270]: NETFILTER_CFG table=filter:95 family=10 entries=1 op=nft_register_rule pid=4270 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.203000 audit[4270]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe4c1979d0 a2=0 a3=7ffe4c1979bc items=0 ppid=4145 pid=4270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.203000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 01:20:47.206000 audit[4273]: NETFILTER_CFG table=filter:96 family=10 entries=1 op=nft_register_rule pid=4273 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.206000 audit[4273]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe70e823f0 a2=0 a3=7ffe70e823dc items=0 ppid=4145 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.206000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 01:20:47.209000 audit[4276]: NETFILTER_CFG table=filter:97 family=10 entries=1 op=nft_register_rule pid=4276 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.209000 audit[4276]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe5c9b80c0 a2=0 a3=7ffe5c9b80ac items=0 ppid=4145 pid=4276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:20:47.209000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 28 01:20:47.211000 audit[4277]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=4277 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.211000 audit[4277]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff8e96eb70 a2=0 a3=7fff8e96eb5c items=0 ppid=4145 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.211000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 01:20:47.213000 audit[4279]: NETFILTER_CFG table=nat:99 family=10 entries=1 op=nft_register_rule pid=4279 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.213000 audit[4279]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd647ba1d0 a2=0 a3=7ffd647ba1bc items=0 ppid=4145 pid=4279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.213000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:20:47.216000 audit[4282]: NETFILTER_CFG table=nat:100 family=10 entries=1 op=nft_register_rule pid=4282 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.216000 audit[4282]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcd6ec6800 a2=0 a3=7ffcd6ec67ec items=0 ppid=4145 pid=4282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.216000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 01:20:47.217000 audit[4283]: NETFILTER_CFG table=nat:101 family=10 entries=1 op=nft_register_chain pid=4283 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.217000 audit[4283]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdf2f22590 a2=0 a3=7ffdf2f2257c items=0 ppid=4145 pid=4283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.217000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 01:20:47.219000 audit[4285]: NETFILTER_CFG table=nat:102 family=10 entries=2 op=nft_register_chain pid=4285 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.219000 audit[4285]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffce1b0e340 a2=0 a3=7ffce1b0e32c items=0 ppid=4145 pid=4285 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.219000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 01:20:47.220000 audit[4286]: NETFILTER_CFG table=filter:103 family=10 entries=1 op=nft_register_chain pid=4286 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.220000 audit[4286]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffede6172d0 a2=0 a3=7ffede6172bc items=0 ppid=4145 pid=4286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.220000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 01:20:47.222000 audit[4288]: NETFILTER_CFG table=filter:104 family=10 entries=1 op=nft_register_rule pid=4288 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.222000 audit[4288]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc467e7be0 a2=0 a3=7ffc467e7bcc items=0 ppid=4145 pid=4288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.222000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:20:47.224000 audit[4291]: NETFILTER_CFG table=filter:105 family=10 entries=1 op=nft_register_rule pid=4291 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 01:20:47.224000 audit[4291]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc66178210 a2=0 a3=7ffc661781fc items=0 ppid=4145 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.224000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 01:20:47.230000 audit[4293]: NETFILTER_CFG table=filter:106 family=10 entries=3 op=nft_register_rule pid=4293 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 01:20:47.230000 audit[4293]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffe62f95970 a2=0 a3=7ffe62f9595c items=0 ppid=4145 pid=4293 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.230000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:47.230000 audit[4293]: NETFILTER_CFG table=nat:107 family=10 entries=7 op=nft_register_chain pid=4293 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 01:20:47.230000 audit[4293]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffe62f95970 a2=0 a3=7ffe62f9595c items=0 ppid=4145 pid=4293 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.230000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:47.258613 containerd[2557]: time="2026-01-28T01:20:47.258543230Z" level=info msg="connecting to shim 314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685" address="unix:///run/containerd/s/239e191609478922dd3adbfc5eb34ceee05bee87bf751128dd2fdba53d54f55d" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:20:47.279128 systemd[1]: Started cri-containerd-314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685.scope - libcontainer container 314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685. Jan 28 01:20:47.285000 audit: BPF prog-id=165 op=LOAD Jan 28 01:20:47.285000 audit: BPF prog-id=166 op=LOAD Jan 28 01:20:47.285000 audit[4313]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.285000 audit: BPF prog-id=166 op=UNLOAD Jan 28 01:20:47.285000 audit[4313]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.286000 audit: BPF prog-id=167 op=LOAD Jan 28 01:20:47.286000 audit[4313]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.286000 audit: BPF prog-id=168 op=LOAD Jan 28 01:20:47.286000 audit[4313]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.286000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.286000 audit: BPF prog-id=168 op=UNLOAD Jan 28 01:20:47.286000 audit[4313]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.286000 audit: BPF prog-id=167 op=UNLOAD Jan 28 01:20:47.286000 audit[4313]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.286000 audit: BPF prog-id=169 op=LOAD Jan 28 01:20:47.286000 audit[4313]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4302 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:47.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331343430386361326432643565666132336661303663303261653165 Jan 28 01:20:47.313262 containerd[2557]: time="2026-01-28T01:20:47.313218288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mrvns,Uid:3d81826c-d19f-4ff7-8a41-1ae525b4b4c5,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685\"" Jan 28 01:20:47.314417 containerd[2557]: time="2026-01-28T01:20:47.314398011Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 28 01:20:48.310973 kubelet[4038]: I0128 01:20:48.310102 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hjwfm" podStartSLOduration=3.310086487 podStartE2EDuration="3.310086487s" podCreationTimestamp="2026-01-28 01:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:20:47.88144652 +0000 UTC m=+8.225065184" watchObservedRunningTime="2026-01-28 01:20:48.310086487 +0000 UTC m=+8.653705152" Jan 28 01:20:48.643945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1492643866.mount: Deactivated successfully. 
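The audit PROCTITLE fields throughout this span record each command line as a hex-encoded, NUL-separated argv buffer. A minimal Python sketch for decoding them follows; the sample value is the first kube-proxy proctitle in this span, and the helper name is only illustrative:

def decode_proctitle(hex_value: str) -> str:
    # PROCTITLE is the raw argv buffer: hex-encoded bytes with NUL between arguments.
    raw = bytes.fromhex(hex_value)
    return " ".join(arg.decode() for arg in raw.split(b"\x00") if arg)

sample = ("69707461626C6573002D770035002D5700313030303030"
          "002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65")
print(decode_proctitle(sample))
# -> iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle

Applied to the other records above, the same decoding shows kube-proxy creating its KUBE-* chains and jump rules in the filter and nat tables, and the iptables-restore/ip6tables-restore calls running with "--noflush --counters".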
Jan 28 01:20:49.380937 containerd[2557]: time="2026-01-28T01:20:49.380896270Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:49.382849 containerd[2557]: time="2026-01-28T01:20:49.382817424Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 28 01:20:49.385200 containerd[2557]: time="2026-01-28T01:20:49.385171305Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:49.388069 containerd[2557]: time="2026-01-28T01:20:49.388029889Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:20:49.388399 containerd[2557]: time="2026-01-28T01:20:49.388378702Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.073953774s" Jan 28 01:20:49.388434 containerd[2557]: time="2026-01-28T01:20:49.388406907Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 28 01:20:49.394582 containerd[2557]: time="2026-01-28T01:20:49.394547149Z" level=info msg="CreateContainer within sandbox \"314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 28 01:20:49.412174 containerd[2557]: time="2026-01-28T01:20:49.412146247Z" level=info msg="Container 0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:20:49.414552 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount371286088.mount: Deactivated successfully. Jan 28 01:20:49.423445 containerd[2557]: time="2026-01-28T01:20:49.423417664Z" level=info msg="CreateContainer within sandbox \"314408ca2d2d5efa23fa06c02ae1ea992d9327b47216e573b0ad49f818d55685\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b\"" Jan 28 01:20:49.424038 containerd[2557]: time="2026-01-28T01:20:49.423998198Z" level=info msg="StartContainer for \"0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b\"" Jan 28 01:20:49.424922 containerd[2557]: time="2026-01-28T01:20:49.424898697Z" level=info msg="connecting to shim 0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b" address="unix:///run/containerd/s/239e191609478922dd3adbfc5eb34ceee05bee87bf751128dd2fdba53d54f55d" protocol=ttrpc version=3 Jan 28 01:20:49.441125 systemd[1]: Started cri-containerd-0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b.scope - libcontainer container 0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b. 
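The pull of quay.io/tigera/operator:v1.38.7 logged just above reports a duration of 2.073953774s. As a sanity check, that figure is consistent with the gap between the "PullImage" and "Pulled image" record timestamps; a minimal sketch using only the two timestamps quoted from this log:

from datetime import datetime, timezone

def parse_ts(ts: str) -> datetime:
    # containerd logs nanosecond timestamps; keep only microseconds for datetime
    base, frac = ts.rstrip("Z").split(".")
    return datetime.strptime(base, "%Y-%m-%dT%H:%M:%S").replace(
        microsecond=int(frac[:6]), tzinfo=timezone.utc)

started = parse_ts("2026-01-28T01:20:47.314398011Z")   # "PullImage" record
finished = parse_ts("2026-01-28T01:20:49.388378702Z")  # "Pulled image ..." record
print((finished - started).total_seconds())  # ~2.07398s vs. the reported 2.073953774s

The few tens of microseconds of residual presumably reflect that containerd times the pull internally rather than at the moment each log line is emitted.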
Jan 28 01:20:49.449000 audit: BPF prog-id=170 op=LOAD Jan 28 01:20:49.449000 audit: BPF prog-id=171 op=LOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=171 op=UNLOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=172 op=LOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=173 op=LOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=173 op=UNLOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=172 op=UNLOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.449000 audit: BPF prog-id=174 op=LOAD Jan 28 01:20:49.449000 audit[4346]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4302 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:49.449000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064646161653366363662626334326561616533333733353433376562 Jan 28 01:20:49.465738 containerd[2557]: time="2026-01-28T01:20:49.465704703Z" level=info msg="StartContainer for \"0ddaae3f66bbc42eaae33735437ebf98cea99f296c9353ee7346d8544077985b\" returns successfully" Jan 28 01:20:49.883837 kubelet[4038]: I0128 01:20:49.883787 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-mrvns" podStartSLOduration=1.80882299 podStartE2EDuration="3.883669957s" podCreationTimestamp="2026-01-28 01:20:46 +0000 UTC" firstStartedPulling="2026-01-28 01:20:47.314095211 +0000 UTC m=+7.657713877" lastFinishedPulling="2026-01-28 01:20:49.388942185 +0000 UTC m=+9.732560844" observedRunningTime="2026-01-28 01:20:49.883435226 +0000 UTC m=+10.227053890" watchObservedRunningTime="2026-01-28 01:20:49.883669957 +0000 UTC m=+10.227288620" Jan 28 01:20:55.084886 sudo[3019]: pam_unix(sudo:session): session closed for user root Jan 28 01:20:55.092059 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 28 01:20:55.092137 kernel: audit: type=1106 audit(1769563255.084:539): pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:55.084000 audit[3019]: USER_END pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:55.084000 audit[3019]: CRED_DISP pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 01:20:55.100039 kernel: audit: type=1104 audit(1769563255.084:540): pid=3019 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:55.191972 sshd[3018]: Connection closed by 10.200.16.10 port 43340 Jan 28 01:20:55.192661 sshd-session[3014]: pam_unix(sshd:session): session closed for user core Jan 28 01:20:55.193000 audit[3014]: USER_END pid=3014 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:55.198548 systemd[1]: sshd@6-10.200.8.20:22-10.200.16.10:43340.service: Deactivated successfully. Jan 28 01:20:55.201889 systemd[1]: session-10.scope: Deactivated successfully. Jan 28 01:20:55.201982 kernel: audit: type=1106 audit(1769563255.193:541): pid=3014 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:55.204769 systemd[1]: session-10.scope: Consumed 3.044s CPU time, 230.3M memory peak. Jan 28 01:20:55.193000 audit[3014]: CRED_DISP pid=3014 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:55.207144 systemd-logind[2536]: Session 10 logged out. Waiting for processes to exit. Jan 28 01:20:55.209394 systemd-logind[2536]: Removed session 10. Jan 28 01:20:55.212968 kernel: audit: type=1104 audit(1769563255.193:542): pid=3014 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:20:55.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.20:22-10.200.16.10:43340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:20:55.220965 kernel: audit: type=1131 audit(1769563255.193:543): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.20:22-10.200.16.10:43340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:20:56.578000 audit[4428]: NETFILTER_CFG table=filter:108 family=2 entries=15 op=nft_register_rule pid=4428 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.585150 kernel: audit: type=1325 audit(1769563256.578:544): table=filter:108 family=2 entries=15 op=nft_register_rule pid=4428 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.578000 audit[4428]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffdf0adcc10 a2=0 a3=7ffdf0adcbfc items=0 ppid=4145 pid=4428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.595076 kernel: audit: type=1300 audit(1769563256.578:544): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffdf0adcc10 a2=0 a3=7ffdf0adcbfc items=0 ppid=4145 pid=4428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.578000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:56.607623 kernel: audit: type=1327 audit(1769563256.578:544): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:56.607720 kernel: audit: type=1325 audit(1769563256.592:545): table=nat:109 family=2 entries=12 op=nft_register_rule pid=4428 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.592000 audit[4428]: NETFILTER_CFG table=nat:109 family=2 entries=12 op=nft_register_rule pid=4428 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.615664 kernel: audit: type=1300 audit(1769563256.592:545): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdf0adcc10 a2=0 a3=0 items=0 ppid=4145 pid=4428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.592000 audit[4428]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdf0adcc10 a2=0 a3=0 items=0 ppid=4145 pid=4428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.592000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:56.620000 audit[4430]: NETFILTER_CFG table=filter:110 family=2 entries=16 op=nft_register_rule pid=4430 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.620000 audit[4430]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff3b46d270 a2=0 a3=7fff3b46d25c items=0 ppid=4145 pid=4430 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.620000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:56.624000 audit[4430]: NETFILTER_CFG table=nat:111 family=2 entries=12 op=nft_register_rule pid=4430 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:56.624000 audit[4430]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff3b46d270 a2=0 a3=0 items=0 ppid=4145 pid=4430 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:56.624000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:59.054000 audit[4432]: NETFILTER_CFG table=filter:112 family=2 entries=17 op=nft_register_rule pid=4432 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:59.054000 audit[4432]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7fff831c8100 a2=0 a3=7fff831c80ec items=0 ppid=4145 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:59.054000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:59.058000 audit[4432]: NETFILTER_CFG table=nat:113 family=2 entries=12 op=nft_register_rule pid=4432 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:59.058000 audit[4432]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff831c8100 a2=0 a3=0 items=0 ppid=4145 pid=4432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:59.058000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:59.070000 audit[4434]: NETFILTER_CFG table=filter:114 family=2 entries=18 op=nft_register_rule pid=4434 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:59.070000 audit[4434]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffe0b28ed60 a2=0 a3=7ffe0b28ed4c items=0 ppid=4145 pid=4434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:59.070000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:20:59.074000 audit[4434]: NETFILTER_CFG table=nat:115 family=2 entries=12 op=nft_register_rule pid=4434 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:20:59.074000 audit[4434]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe0b28ed60 a2=0 a3=0 items=0 ppid=4145 pid=4434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:20:59.074000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:00.083000 audit[4436]: NETFILTER_CFG table=filter:116 family=2 entries=19 op=nft_register_rule pid=4436 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:00.083000 audit[4436]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=7480 a0=3 a1=7ffeddd6f160 a2=0 a3=7ffeddd6f14c items=0 ppid=4145 pid=4436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:00.083000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:00.089000 audit[4436]: NETFILTER_CFG table=nat:117 family=2 entries=12 op=nft_register_rule pid=4436 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:00.091505 kernel: kauditd_printk_skb: 22 callbacks suppressed Jan 28 01:21:00.091544 kernel: audit: type=1325 audit(1769563260.089:553): table=nat:117 family=2 entries=12 op=nft_register_rule pid=4436 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:00.089000 audit[4436]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeddd6f160 a2=0 a3=0 items=0 ppid=4145 pid=4436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:00.102692 kernel: audit: type=1300 audit(1769563260.089:553): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeddd6f160 a2=0 a3=0 items=0 ppid=4145 pid=4436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:00.089000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:00.107078 kernel: audit: type=1327 audit(1769563260.089:553): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:00.693047 systemd[1]: Created slice kubepods-besteffort-pod4c1019c9_1796_4b79_a880_1ba209ffaeea.slice - libcontainer container kubepods-besteffort-pod4c1019c9_1796_4b79_a880_1ba209ffaeea.slice. 
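The kubelet pod_startup_latency_tracker record above for tigera-operator-7dcd859c48-mrvns can be reproduced from the timestamps it carries: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and the numbers are consistent with podStartSLOduration being that figure minus the image-pull window (firstStartedPulling to lastFinishedPulling). A minimal sketch using seconds past 01:20:00 UTC taken from the record:

# Values from the tigera-operator startup-latency record, as seconds past 01:20:00 UTC.
creation   = 46.000000000   # podCreationTimestamp 01:20:46
first_pull = 47.314095211   # firstStartedPulling
last_pull  = 49.388942185   # lastFinishedPulling
running    = 49.883669957   # observedRunningTime

e2e = running - creation               # 3.883669957 -> podStartE2EDuration
slo = e2e - (last_pull - first_pull)   # 1.808822983 -> podStartSLOduration (1.80882299)
print(f"{e2e:.9f} {slo:.9f}")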
Jan 28 01:21:00.771237 kubelet[4038]: I0128 01:21:00.771205 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c1019c9-1796-4b79-a880-1ba209ffaeea-tigera-ca-bundle\") pod \"calico-typha-5b64757669-l6mh2\" (UID: \"4c1019c9-1796-4b79-a880-1ba209ffaeea\") " pod="calico-system/calico-typha-5b64757669-l6mh2" Jan 28 01:21:00.771237 kubelet[4038]: I0128 01:21:00.771239 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4t8\" (UniqueName: \"kubernetes.io/projected/4c1019c9-1796-4b79-a880-1ba209ffaeea-kube-api-access-mf4t8\") pod \"calico-typha-5b64757669-l6mh2\" (UID: \"4c1019c9-1796-4b79-a880-1ba209ffaeea\") " pod="calico-system/calico-typha-5b64757669-l6mh2" Jan 28 01:21:00.771572 kubelet[4038]: I0128 01:21:00.771255 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4c1019c9-1796-4b79-a880-1ba209ffaeea-typha-certs\") pod \"calico-typha-5b64757669-l6mh2\" (UID: \"4c1019c9-1796-4b79-a880-1ba209ffaeea\") " pod="calico-system/calico-typha-5b64757669-l6mh2" Jan 28 01:21:00.867539 systemd[1]: Created slice kubepods-besteffort-podd6b9c031_4b04_40c7_95b4_eb08bd321d8d.slice - libcontainer container kubepods-besteffort-podd6b9c031_4b04_40c7_95b4_eb08bd321d8d.slice. Jan 28 01:21:00.871620 kubelet[4038]: I0128 01:21:00.871534 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-lib-modules\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.871620 kubelet[4038]: I0128 01:21:00.871575 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-cni-log-dir\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872097 kubelet[4038]: I0128 01:21:00.871742 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-policysync\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872097 kubelet[4038]: I0128 01:21:00.871831 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-tigera-ca-bundle\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872097 kubelet[4038]: I0128 01:21:00.871850 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-cni-bin-dir\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872097 kubelet[4038]: I0128 01:21:00.872013 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-cni-net-dir\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872097 kubelet[4038]: I0128 01:21:00.872032 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-flexvol-driver-host\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872231 kubelet[4038]: I0128 01:21:00.872049 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-var-lib-calico\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872425 kubelet[4038]: I0128 01:21:00.872271 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2mp\" (UniqueName: \"kubernetes.io/projected/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-kube-api-access-dx2mp\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872425 kubelet[4038]: I0128 01:21:00.872313 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-node-certs\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872425 kubelet[4038]: I0128 01:21:00.872350 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-var-run-calico\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.872425 kubelet[4038]: I0128 01:21:00.872385 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d6b9c031-4b04-40c7-95b4-eb08bd321d8d-xtables-lock\") pod \"calico-node-r5dkf\" (UID: \"d6b9c031-4b04-40c7-95b4-eb08bd321d8d\") " pod="calico-system/calico-node-r5dkf" Jan 28 01:21:00.973647 kubelet[4038]: E0128 01:21:00.973559 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.974568 kubelet[4038]: W0128 01:21:00.973578 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.974568 kubelet[4038]: E0128 01:21:00.974268 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.975162 kubelet[4038]: E0128 01:21:00.975052 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.975162 kubelet[4038]: W0128 01:21:00.975065 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.975162 kubelet[4038]: E0128 01:21:00.975078 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.975617 kubelet[4038]: E0128 01:21:00.975594 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.975765 kubelet[4038]: W0128 01:21:00.975665 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.975765 kubelet[4038]: E0128 01:21:00.975677 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.976030 kubelet[4038]: E0128 01:21:00.976004 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.976030 kubelet[4038]: W0128 01:21:00.976012 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.976030 kubelet[4038]: E0128 01:21:00.976020 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.976309 kubelet[4038]: E0128 01:21:00.976266 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.976309 kubelet[4038]: W0128 01:21:00.976272 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.976309 kubelet[4038]: E0128 01:21:00.976279 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.976500 kubelet[4038]: E0128 01:21:00.976464 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.976500 kubelet[4038]: W0128 01:21:00.976469 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.976500 kubelet[4038]: E0128 01:21:00.976477 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.976651 kubelet[4038]: E0128 01:21:00.976640 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.976708 kubelet[4038]: W0128 01:21:00.976680 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.976708 kubelet[4038]: E0128 01:21:00.976687 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.976839 kubelet[4038]: E0128 01:21:00.976834 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.976884 kubelet[4038]: W0128 01:21:00.976868 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.976884 kubelet[4038]: E0128 01:21:00.976875 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.977062 kubelet[4038]: E0128 01:21:00.977028 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977062 kubelet[4038]: W0128 01:21:00.977035 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.977062 kubelet[4038]: E0128 01:21:00.977041 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.977273 kubelet[4038]: E0128 01:21:00.977254 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977273 kubelet[4038]: W0128 01:21:00.977260 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.977273 kubelet[4038]: E0128 01:21:00.977266 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.977479 kubelet[4038]: E0128 01:21:00.977457 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977479 kubelet[4038]: W0128 01:21:00.977465 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.977479 kubelet[4038]: E0128 01:21:00.977472 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.977940 kubelet[4038]: E0128 01:21:00.977711 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977940 kubelet[4038]: W0128 01:21:00.977733 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.977940 kubelet[4038]: E0128 01:21:00.977743 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.977940 kubelet[4038]: E0128 01:21:00.977854 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977940 kubelet[4038]: W0128 01:21:00.977859 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.977940 kubelet[4038]: E0128 01:21:00.977866 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.977940 kubelet[4038]: E0128 01:21:00.977940 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.977940 kubelet[4038]: W0128 01:21:00.977943 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.978160 kubelet[4038]: E0128 01:21:00.977976 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.978160 kubelet[4038]: E0128 01:21:00.978052 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.978160 kubelet[4038]: W0128 01:21:00.978056 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.978160 kubelet[4038]: E0128 01:21:00.978062 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.978160 kubelet[4038]: E0128 01:21:00.978129 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.978160 kubelet[4038]: W0128 01:21:00.978132 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.978160 kubelet[4038]: E0128 01:21:00.978138 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.978302 kubelet[4038]: E0128 01:21:00.978236 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.978302 kubelet[4038]: W0128 01:21:00.978241 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.978302 kubelet[4038]: E0128 01:21:00.978246 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.978821 kubelet[4038]: E0128 01:21:00.978784 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.978821 kubelet[4038]: W0128 01:21:00.978796 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.978821 kubelet[4038]: E0128 01:21:00.978807 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.979310 kubelet[4038]: E0128 01:21:00.979278 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.979310 kubelet[4038]: W0128 01:21:00.979288 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.979310 kubelet[4038]: E0128 01:21:00.979298 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.979576 kubelet[4038]: E0128 01:21:00.979569 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.979662 kubelet[4038]: W0128 01:21:00.979626 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.979662 kubelet[4038]: E0128 01:21:00.979638 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.980014 kubelet[4038]: E0128 01:21:00.979894 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.980014 kubelet[4038]: W0128 01:21:00.979902 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.980014 kubelet[4038]: E0128 01:21:00.979911 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.980242 kubelet[4038]: E0128 01:21:00.980180 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.982129 kubelet[4038]: W0128 01:21:00.982065 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.982129 kubelet[4038]: E0128 01:21:00.982085 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.984618 kubelet[4038]: E0128 01:21:00.984479 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.984618 kubelet[4038]: W0128 01:21:00.984491 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.984618 kubelet[4038]: E0128 01:21:00.984502 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.984743 kubelet[4038]: E0128 01:21:00.984731 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.984743 kubelet[4038]: W0128 01:21:00.984740 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.984799 kubelet[4038]: E0128 01:21:00.984748 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.984890 kubelet[4038]: E0128 01:21:00.984876 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.984932 kubelet[4038]: W0128 01:21:00.984926 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.984995 kubelet[4038]: E0128 01:21:00.984972 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.985186 kubelet[4038]: E0128 01:21:00.985164 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.985186 kubelet[4038]: W0128 01:21:00.985171 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.985186 kubelet[4038]: E0128 01:21:00.985178 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.985757 kubelet[4038]: E0128 01:21:00.985383 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.985757 kubelet[4038]: W0128 01:21:00.985391 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.985757 kubelet[4038]: E0128 01:21:00.985399 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.986556 kubelet[4038]: E0128 01:21:00.986537 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.986616 kubelet[4038]: W0128 01:21:00.986562 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.986616 kubelet[4038]: E0128 01:21:00.986572 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.988173 kubelet[4038]: E0128 01:21:00.988158 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.988173 kubelet[4038]: W0128 01:21:00.988173 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.988237 kubelet[4038]: E0128 01:21:00.988184 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.988932 kubelet[4038]: E0128 01:21:00.988307 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.988932 kubelet[4038]: W0128 01:21:00.988314 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.988932 kubelet[4038]: E0128 01:21:00.988320 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:00.989065 kubelet[4038]: E0128 01:21:00.989012 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:00.989065 kubelet[4038]: W0128 01:21:00.989019 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:00.989065 kubelet[4038]: E0128 01:21:00.989031 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:00.997691 containerd[2557]: time="2026-01-28T01:21:00.997655193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b64757669-l6mh2,Uid:4c1019c9-1796-4b79-a880-1ba209ffaeea,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:01.057972 containerd[2557]: time="2026-01-28T01:21:01.057884800Z" level=info msg="connecting to shim e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581" address="unix:///run/containerd/s/8b18c8c9ca8428ca15aecea85b0309ed4711c0d646f44cd83a97f8fd5d7269ea" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:01.082806 kubelet[4038]: E0128 01:21:01.082417 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:01.101385 systemd[1]: Started cri-containerd-e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581.scope - libcontainer container e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581. Jan 28 01:21:01.113000 audit[4523]: NETFILTER_CFG table=filter:118 family=2 entries=21 op=nft_register_rule pid=4523 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:01.119981 kernel: audit: type=1325 audit(1769563261.113:554): table=filter:118 family=2 entries=21 op=nft_register_rule pid=4523 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:01.113000 audit[4523]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe6618c4d0 a2=0 a3=7ffe6618c4bc items=0 ppid=4145 pid=4523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.130975 kernel: audit: type=1300 audit(1769563261.113:554): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe6618c4d0 a2=0 a3=7ffe6618c4bc items=0 ppid=4145 pid=4523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.113000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:01.141632 kernel: audit: type=1327 audit(1769563261.113:554): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:01.141679 kernel: audit: type=1325 audit(1769563261.121:555): table=nat:119 family=2 entries=12 op=nft_register_rule pid=4523 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:01.121000 audit[4523]: NETFILTER_CFG table=nat:119 family=2 entries=12 op=nft_register_rule pid=4523 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:01.121000 audit: BPF prog-id=175 op=LOAD Jan 28 01:21:01.151850 kernel: audit: type=1334 audit(1769563261.121:556): prog-id=175 op=LOAD Jan 28 01:21:01.151905 kernel: audit: type=1300 audit(1769563261.121:555): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe6618c4d0 a2=0 a3=0 items=0 ppid=4145 pid=4523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.121000 audit[4523]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe6618c4d0 a2=0 a3=0 items=0 ppid=4145 pid=4523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.157098 kernel: audit: type=1327 audit(1769563261.121:555): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:01.121000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:01.129000 audit: BPF prog-id=176 op=LOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=176 op=UNLOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=177 op=LOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=178 op=LOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=178 op=UNLOAD Jan 28 01:21:01.129000 
audit[4495]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=177 op=UNLOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.129000 audit: BPF prog-id=179 op=LOAD Jan 28 01:21:01.129000 audit[4495]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4481 pid=4495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537623265363132656136653631333137313265343336336632623434 Jan 28 01:21:01.172798 containerd[2557]: time="2026-01-28T01:21:01.172746508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5dkf,Uid:d6b9c031-4b04-40c7-95b4-eb08bd321d8d,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:01.174119 containerd[2557]: time="2026-01-28T01:21:01.174061190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b64757669-l6mh2,Uid:4c1019c9-1796-4b79-a880-1ba209ffaeea,Namespace:calico-system,Attempt:0,} returns sandbox id \"e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581\"" Jan 28 01:21:01.174425 kubelet[4038]: E0128 01:21:01.174411 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.174496 kubelet[4038]: W0128 01:21:01.174426 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.174496 kubelet[4038]: E0128 01:21:01.174453 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174567 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.174868 kubelet[4038]: W0128 01:21:01.174574 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174580 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174683 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.174868 kubelet[4038]: W0128 01:21:01.174687 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174693 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174809 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.174868 kubelet[4038]: W0128 01:21:01.174813 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.174868 kubelet[4038]: E0128 01:21:01.174829 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175129 kubelet[4038]: E0128 01:21:01.174921 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175129 kubelet[4038]: W0128 01:21:01.174926 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175129 kubelet[4038]: E0128 01:21:01.174932 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175195 kubelet[4038]: E0128 01:21:01.175163 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175195 kubelet[4038]: W0128 01:21:01.175169 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175195 kubelet[4038]: E0128 01:21:01.175176 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.175257 kubelet[4038]: E0128 01:21:01.175253 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175277 kubelet[4038]: W0128 01:21:01.175257 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175277 kubelet[4038]: E0128 01:21:01.175262 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175330 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175909 kubelet[4038]: W0128 01:21:01.175338 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175342 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175415 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175909 kubelet[4038]: W0128 01:21:01.175419 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175424 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175489 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.175909 kubelet[4038]: W0128 01:21:01.175493 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175497 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.175909 kubelet[4038]: E0128 01:21:01.175559 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176181 kubelet[4038]: W0128 01:21:01.175563 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175568 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175633 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176181 kubelet[4038]: W0128 01:21:01.175637 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175641 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175711 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176181 kubelet[4038]: W0128 01:21:01.175715 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175720 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.176181 kubelet[4038]: E0128 01:21:01.175779 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176181 kubelet[4038]: W0128 01:21:01.175783 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.175788 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.175848 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176372 kubelet[4038]: W0128 01:21:01.175852 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.175858 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.175920 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176372 kubelet[4038]: W0128 01:21:01.175924 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.175929 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.176028 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.176372 kubelet[4038]: W0128 01:21:01.176032 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.176372 kubelet[4038]: E0128 01:21:01.176039 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176103 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.177208 kubelet[4038]: W0128 01:21:01.176107 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176112 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176176 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.177208 kubelet[4038]: W0128 01:21:01.176179 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176183 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176248 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.177208 kubelet[4038]: W0128 01:21:01.176252 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.177208 kubelet[4038]: E0128 01:21:01.176257 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.177424 containerd[2557]: time="2026-01-28T01:21:01.176818687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 28 01:21:01.209274 containerd[2557]: time="2026-01-28T01:21:01.209247275Z" level=info msg="connecting to shim 73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a" address="unix:///run/containerd/s/63ec6920ffca3a0bf7ee154659da7664c79cd53c19c4ad6e9db3e182f181f497" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:01.228117 systemd[1]: Started cri-containerd-73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a.scope - libcontainer container 73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a. 
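The repeated driver-call.go and plugins.go errors in this window come from the kubelet re-probing its FlexVolume plugin directory on every volume reconcile: the nodeagent~uds driver binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not present yet (it is normally installed on the host via the flexvol-driver-host host-path mounted into calico-node above), so the probe returns no output and parsing that empty output as JSON fails with "unexpected end of JSON input". A rough sketch of that failure mode, in Python rather than the kubelet's Go and with the probing logic simplified:

# Illustrative only (not the kubelet's actual implementation): why a missing
# FlexVolume driver binary surfaces as "unexpected end of JSON input".
import json
import subprocess

DRIVER = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

def probe_driver(path: str) -> dict:
    try:
        # The kubelet calls the driver with the "init" subcommand and expects
        # a JSON status object on stdout.
        out = subprocess.run([path, "init"], capture_output=True, text=True).stdout
    except FileNotFoundError:
        # Binary absent on the host: there is no output at all...
        out = ""
    # ...so decoding the empty string fails (Python: "Expecting value";
    # Go's encoding/json: "unexpected end of JSON input").
    return json.loads(out)

if __name__ == "__main__":
    try:
        probe_driver(DRIVER)
    except json.JSONDecodeError as err:
        print(f"probe failed: {err}")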
Jan 28 01:21:01.235000 audit: BPF prog-id=180 op=LOAD Jan 28 01:21:01.235000 audit: BPF prog-id=181 op=LOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.235000 audit: BPF prog-id=181 op=UNLOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.235000 audit: BPF prog-id=182 op=LOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.235000 audit: BPF prog-id=183 op=LOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.235000 audit: BPF prog-id=183 op=UNLOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.235000 audit: BPF prog-id=182 op=UNLOAD Jan 28 01:21:01.235000 audit[4569]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.235000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.236000 audit: BPF prog-id=184 op=LOAD Jan 28 01:21:01.236000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4558 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:01.236000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733393136373132653766383430353933646266373565363163333166 Jan 28 01:21:01.248489 containerd[2557]: time="2026-01-28T01:21:01.248466228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5dkf,Uid:d6b9c031-4b04-40c7-95b4-eb08bd321d8d,Namespace:calico-system,Attempt:0,} returns sandbox id \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\"" Jan 28 01:21:01.275642 kubelet[4038]: E0128 01:21:01.275623 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.275702 kubelet[4038]: W0128 01:21:01.275659 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.275702 kubelet[4038]: E0128 01:21:01.275676 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.275760 kubelet[4038]: I0128 01:21:01.275704 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74fe3431-17ca-4902-9eb5-64c3701d8bd6-kubelet-dir\") pod \"csi-node-driver-lcd4c\" (UID: \"74fe3431-17ca-4902-9eb5-64c3701d8bd6\") " pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:01.275882 kubelet[4038]: E0128 01:21:01.275871 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.275908 kubelet[4038]: W0128 01:21:01.275881 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.275908 kubelet[4038]: E0128 01:21:01.275889 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.275973 kubelet[4038]: I0128 01:21:01.275909 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74fe3431-17ca-4902-9eb5-64c3701d8bd6-registration-dir\") pod \"csi-node-driver-lcd4c\" (UID: \"74fe3431-17ca-4902-9eb5-64c3701d8bd6\") " pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:01.276071 kubelet[4038]: E0128 01:21:01.276054 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276071 kubelet[4038]: W0128 01:21:01.276065 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276138 kubelet[4038]: E0128 01:21:01.276075 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.276138 kubelet[4038]: I0128 01:21:01.276090 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74fe3431-17ca-4902-9eb5-64c3701d8bd6-socket-dir\") pod \"csi-node-driver-lcd4c\" (UID: \"74fe3431-17ca-4902-9eb5-64c3701d8bd6\") " pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:01.276229 kubelet[4038]: E0128 01:21:01.276203 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276229 kubelet[4038]: W0128 01:21:01.276225 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276282 kubelet[4038]: E0128 01:21:01.276232 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.276386 kubelet[4038]: I0128 01:21:01.276331 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrkm\" (UniqueName: \"kubernetes.io/projected/74fe3431-17ca-4902-9eb5-64c3701d8bd6-kube-api-access-gkrkm\") pod \"csi-node-driver-lcd4c\" (UID: \"74fe3431-17ca-4902-9eb5-64c3701d8bd6\") " pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:01.276386 kubelet[4038]: E0128 01:21:01.276348 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276386 kubelet[4038]: W0128 01:21:01.276355 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276386 kubelet[4038]: E0128 01:21:01.276362 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.276472 kubelet[4038]: E0128 01:21:01.276463 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276490 kubelet[4038]: W0128 01:21:01.276475 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276490 kubelet[4038]: E0128 01:21:01.276481 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.276613 kubelet[4038]: E0128 01:21:01.276602 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276613 kubelet[4038]: W0128 01:21:01.276608 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276657 kubelet[4038]: E0128 01:21:01.276614 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.276747 kubelet[4038]: E0128 01:21:01.276735 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276747 kubelet[4038]: W0128 01:21:01.276744 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276802 kubelet[4038]: E0128 01:21:01.276751 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.276868 kubelet[4038]: E0128 01:21:01.276859 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.276868 kubelet[4038]: W0128 01:21:01.276866 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.276915 kubelet[4038]: E0128 01:21:01.276872 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.276915 kubelet[4038]: I0128 01:21:01.276894 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/74fe3431-17ca-4902-9eb5-64c3701d8bd6-varrun\") pod \"csi-node-driver-lcd4c\" (UID: \"74fe3431-17ca-4902-9eb5-64c3701d8bd6\") " pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:01.277044 kubelet[4038]: E0128 01:21:01.277022 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277044 kubelet[4038]: W0128 01:21:01.277042 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277108 kubelet[4038]: E0128 01:21:01.277048 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.277165 kubelet[4038]: E0128 01:21:01.277143 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277165 kubelet[4038]: W0128 01:21:01.277163 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277208 kubelet[4038]: E0128 01:21:01.277169 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.277335 kubelet[4038]: E0128 01:21:01.277277 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277335 kubelet[4038]: W0128 01:21:01.277284 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277335 kubelet[4038]: E0128 01:21:01.277290 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.277417 kubelet[4038]: E0128 01:21:01.277408 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277417 kubelet[4038]: W0128 01:21:01.277415 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277493 kubelet[4038]: E0128 01:21:01.277420 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.277537 kubelet[4038]: E0128 01:21:01.277529 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277537 kubelet[4038]: W0128 01:21:01.277534 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277594 kubelet[4038]: E0128 01:21:01.277540 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.277643 kubelet[4038]: E0128 01:21:01.277635 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.277643 kubelet[4038]: W0128 01:21:01.277641 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.277695 kubelet[4038]: E0128 01:21:01.277646 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.378506 kubelet[4038]: E0128 01:21:01.378485 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.378506 kubelet[4038]: W0128 01:21:01.378501 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.378718 kubelet[4038]: E0128 01:21:01.378515 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.378718 kubelet[4038]: E0128 01:21:01.378633 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.378718 kubelet[4038]: W0128 01:21:01.378638 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.378718 kubelet[4038]: E0128 01:21:01.378644 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.378882 kubelet[4038]: E0128 01:21:01.378774 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.378882 kubelet[4038]: W0128 01:21:01.378779 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.378882 kubelet[4038]: E0128 01:21:01.378786 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.379067 kubelet[4038]: E0128 01:21:01.379036 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379067 kubelet[4038]: W0128 01:21:01.379056 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379170 kubelet[4038]: E0128 01:21:01.379069 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379170 kubelet[4038]: E0128 01:21:01.379167 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379249 kubelet[4038]: W0128 01:21:01.379172 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379249 kubelet[4038]: E0128 01:21:01.379178 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379308 kubelet[4038]: E0128 01:21:01.379262 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379308 kubelet[4038]: W0128 01:21:01.379266 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379308 kubelet[4038]: E0128 01:21:01.379272 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379408 kubelet[4038]: E0128 01:21:01.379398 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379408 kubelet[4038]: W0128 01:21:01.379403 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379464 kubelet[4038]: E0128 01:21:01.379409 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379522 kubelet[4038]: E0128 01:21:01.379511 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379522 kubelet[4038]: W0128 01:21:01.379518 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379584 kubelet[4038]: E0128 01:21:01.379524 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.379633 kubelet[4038]: E0128 01:21:01.379599 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379633 kubelet[4038]: W0128 01:21:01.379603 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379633 kubelet[4038]: E0128 01:21:01.379608 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379716 kubelet[4038]: E0128 01:21:01.379681 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379716 kubelet[4038]: W0128 01:21:01.379687 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379716 kubelet[4038]: E0128 01:21:01.379693 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.379914 kubelet[4038]: E0128 01:21:01.379890 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.379914 kubelet[4038]: W0128 01:21:01.379912 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.379978 kubelet[4038]: E0128 01:21:01.379919 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.380070 kubelet[4038]: E0128 01:21:01.380061 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380070 kubelet[4038]: W0128 01:21:01.380067 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380126 kubelet[4038]: E0128 01:21:01.380073 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.380252 kubelet[4038]: E0128 01:21:01.380242 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380252 kubelet[4038]: W0128 01:21:01.380250 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380302 kubelet[4038]: E0128 01:21:01.380257 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.380449 kubelet[4038]: E0128 01:21:01.380438 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380449 kubelet[4038]: W0128 01:21:01.380447 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380510 kubelet[4038]: E0128 01:21:01.380454 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.380613 kubelet[4038]: E0128 01:21:01.380585 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380613 kubelet[4038]: W0128 01:21:01.380605 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380659 kubelet[4038]: E0128 01:21:01.380612 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.380732 kubelet[4038]: E0128 01:21:01.380712 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380732 kubelet[4038]: W0128 01:21:01.380730 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380797 kubelet[4038]: E0128 01:21:01.380736 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.380841 kubelet[4038]: E0128 01:21:01.380829 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.380841 kubelet[4038]: W0128 01:21:01.380833 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.380896 kubelet[4038]: E0128 01:21:01.380840 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381013 kubelet[4038]: E0128 01:21:01.380987 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381013 kubelet[4038]: W0128 01:21:01.381009 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381070 kubelet[4038]: E0128 01:21:01.381016 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.381144 kubelet[4038]: E0128 01:21:01.381126 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381144 kubelet[4038]: W0128 01:21:01.381142 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381205 kubelet[4038]: E0128 01:21:01.381149 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381249 kubelet[4038]: E0128 01:21:01.381240 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381249 kubelet[4038]: W0128 01:21:01.381246 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381296 kubelet[4038]: E0128 01:21:01.381253 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381346 kubelet[4038]: E0128 01:21:01.381336 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381346 kubelet[4038]: W0128 01:21:01.381343 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381402 kubelet[4038]: E0128 01:21:01.381348 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381515 kubelet[4038]: E0128 01:21:01.381495 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381515 kubelet[4038]: W0128 01:21:01.381512 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381565 kubelet[4038]: E0128 01:21:01.381518 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381629 kubelet[4038]: E0128 01:21:01.381620 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381629 kubelet[4038]: W0128 01:21:01.381627 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381672 kubelet[4038]: E0128 01:21:01.381633 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:01.381726 kubelet[4038]: E0128 01:21:01.381717 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381726 kubelet[4038]: W0128 01:21:01.381723 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381769 kubelet[4038]: E0128 01:21:01.381729 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.381908 kubelet[4038]: E0128 01:21:01.381889 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.381908 kubelet[4038]: W0128 01:21:01.381905 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.381945 kubelet[4038]: E0128 01:21:01.381911 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:01.388537 kubelet[4038]: E0128 01:21:01.388519 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:01.388537 kubelet[4038]: W0128 01:21:01.388537 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:01.388638 kubelet[4038]: E0128 01:21:01.388549 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:02.466062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount241612909.mount: Deactivated successfully. 
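A note on the repeated driver-call.go / plugins.go errors above: the kubelet is probing its FlexVolume plugin directory, finds a vendor directory named nodeagent~uds but no executable at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, so every "init" call produces empty output and the JSON unmarshal fails with "unexpected end of JSON input". The probe is repeated on each volume reconcile pass, which is why the same three-line error recurs; the driver these probes expect is normally installed by Calico's pod2daemon-flexvol container, whose image pull appears further down in this log, and the volumes being attached for csi-node-driver-lcd4c (kubelet-dir, registration-dir, socket-dir, varrun, kube-api-access-gkrkm) are ordinary host-path and projected volumes, so the noise is typically harmless. As a rough sketch of what the kubelet's handshake expects (hypothetical stub, not the real Calico driver): the binary is executed with "init" as its first argument and must print a JSON status object on stdout.

    #!/usr/bin/env python3
    # Hypothetical FlexVolume driver stub. It would be installed as the executable
    # /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds so the
    # kubelet's "init" probe receives valid JSON instead of empty output.
    import json
    import sys

    def main() -> int:
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # Report success and declare that this driver does not implement attach/detach.
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
            return 0
        # Any other operation is reported as unsupported.
        print(json.dumps({"status": "Not supported", "message": f"unsupported operation {op!r}"}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())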
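The audit stream interleaved with these container starts is runc preparing the containers: on x86-64, syscall 321 is bpf(2) (each "BPF prog-id=... op=LOAD" line is runc loading a BPF program, typically the cgroup device filters) and syscall 3 is close(2) on the program fd, logged as the matching op=UNLOAD. The PROCTITLE records carry the invoking command line hex-encoded with NUL separators between arguments; the runc ones above decode to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/…" with the tail cut off by the record's length limit. A small stand-alone decoder, as a sketch using only the standard library:

    #!/usr/bin/env python3
    # Decode an audit PROCTITLE field: hex-encoded argv with NUL bytes between arguments.
    def decode_proctitle(hex_value: str) -> list[str]:
        raw = bytes.fromhex(hex_value)
        return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00") if arg]

    if __name__ == "__main__":
        # Sample value taken from the iptables-restore audit record later in this log.
        sample = ("69707461626C65732D726573746F7265002D770035002D5700"
                  "313030303030002D2D6E6F666C757368002D2D636F756E74657273")
        print(decode_proctitle(sample))
        # -> ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']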
Jan 28 01:21:02.740359 kubelet[4038]: E0128 01:21:02.740086 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:03.490724 containerd[2557]: time="2026-01-28T01:21:03.490674810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:03.492885 containerd[2557]: time="2026-01-28T01:21:03.492845026Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 28 01:21:03.495030 containerd[2557]: time="2026-01-28T01:21:03.495002872Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:03.497893 containerd[2557]: time="2026-01-28T01:21:03.497853594Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:03.498203 containerd[2557]: time="2026-01-28T01:21:03.498184529Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.32133909s" Jan 28 01:21:03.498237 containerd[2557]: time="2026-01-28T01:21:03.498211605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 28 01:21:03.499042 containerd[2557]: time="2026-01-28T01:21:03.499019463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 28 01:21:03.514378 containerd[2557]: time="2026-01-28T01:21:03.514349166Z" level=info msg="CreateContainer within sandbox \"e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 28 01:21:03.528226 containerd[2557]: time="2026-01-28T01:21:03.528196742Z" level=info msg="Container 57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:03.544359 containerd[2557]: time="2026-01-28T01:21:03.544336967Z" level=info msg="CreateContainer within sandbox \"e7b2e612ea6e6131712e4363f2b440e6b552b9cdff74f793ff18b9585c9af581\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c\"" Jan 28 01:21:03.544880 containerd[2557]: time="2026-01-28T01:21:03.544787190Z" level=info msg="StartContainer for \"57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c\"" Jan 28 01:21:03.545977 containerd[2557]: time="2026-01-28T01:21:03.545927659Z" level=info msg="connecting to shim 57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c" address="unix:///run/containerd/s/8b18c8c9ca8428ca15aecea85b0309ed4711c0d646f44cd83a97f8fd5d7269ea" protocol=ttrpc version=3 Jan 28 01:21:03.563103 systemd[1]: Started 
cri-containerd-57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c.scope - libcontainer container 57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c. Jan 28 01:21:03.572000 audit: BPF prog-id=185 op=LOAD Jan 28 01:21:03.573000 audit: BPF prog-id=186 op=LOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=186 op=UNLOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=187 op=LOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=188 op=LOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=188 op=UNLOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=187 op=UNLOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.573000 audit: BPF prog-id=189 op=LOAD Jan 28 01:21:03.573000 audit[4646]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4481 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:03.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537613631643537646133336435363836643464646534326564613166 Jan 28 01:21:03.602448 containerd[2557]: time="2026-01-28T01:21:03.602420583Z" level=info msg="StartContainer for \"57a61d57da33d5686d4dde42eda1f556eb06cf4e305cf91e92c871f8a3e2e98c\" returns successfully" Jan 28 01:21:03.918432 kubelet[4038]: I0128 01:21:03.918148 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5b64757669-l6mh2" podStartSLOduration=1.594912874 podStartE2EDuration="3.918131319s" podCreationTimestamp="2026-01-28 01:21:00 +0000 UTC" firstStartedPulling="2026-01-28 01:21:01.175689222 +0000 UTC m=+21.519307885" lastFinishedPulling="2026-01-28 01:21:03.498907664 +0000 UTC m=+23.842526330" observedRunningTime="2026-01-28 01:21:03.916574493 +0000 UTC m=+24.260193158" watchObservedRunningTime="2026-01-28 01:21:03.918131319 +0000 UTC m=+24.261749983" Jan 28 01:21:03.993585 kubelet[4038]: E0128 01:21:03.993536 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.993585 kubelet[4038]: W0128 01:21:03.993568 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.993856 kubelet[4038]: E0128 01:21:03.993724 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.993938 kubelet[4038]: E0128 01:21:03.993900 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.993938 kubelet[4038]: W0128 01:21:03.993907 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.993938 kubelet[4038]: E0128 01:21:03.993915 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.994163 kubelet[4038]: E0128 01:21:03.994124 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.994163 kubelet[4038]: W0128 01:21:03.994130 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.994163 kubelet[4038]: E0128 01:21:03.994137 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.994336 kubelet[4038]: E0128 01:21:03.994296 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.994336 kubelet[4038]: W0128 01:21:03.994301 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.994336 kubelet[4038]: E0128 01:21:03.994307 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.994521 kubelet[4038]: E0128 01:21:03.994487 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.994521 kubelet[4038]: W0128 01:21:03.994494 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.994521 kubelet[4038]: E0128 01:21:03.994501 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.994715 kubelet[4038]: E0128 01:21:03.994682 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.994715 kubelet[4038]: W0128 01:21:03.994689 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.994715 kubelet[4038]: E0128 01:21:03.994695 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.994894 kubelet[4038]: E0128 01:21:03.994856 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.994894 kubelet[4038]: W0128 01:21:03.994862 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.994894 kubelet[4038]: E0128 01:21:03.994870 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.995121 kubelet[4038]: E0128 01:21:03.995076 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.995121 kubelet[4038]: W0128 01:21:03.995083 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.995121 kubelet[4038]: E0128 01:21:03.995089 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.995305 kubelet[4038]: E0128 01:21:03.995270 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.995305 kubelet[4038]: W0128 01:21:03.995276 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.995305 kubelet[4038]: E0128 01:21:03.995285 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.995475 kubelet[4038]: E0128 01:21:03.995442 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.995475 kubelet[4038]: W0128 01:21:03.995449 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.995475 kubelet[4038]: E0128 01:21:03.995455 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.995702 kubelet[4038]: E0128 01:21:03.995643 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.995702 kubelet[4038]: W0128 01:21:03.995649 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.995702 kubelet[4038]: E0128 01:21:03.995656 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.995875 kubelet[4038]: E0128 01:21:03.995828 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.995875 kubelet[4038]: W0128 01:21:03.995835 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.995875 kubelet[4038]: E0128 01:21:03.995841 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.996076 kubelet[4038]: E0128 01:21:03.996042 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.996076 kubelet[4038]: W0128 01:21:03.996048 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.996076 kubelet[4038]: E0128 01:21:03.996055 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.996333 kubelet[4038]: E0128 01:21:03.996307 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.996333 kubelet[4038]: W0128 01:21:03.996330 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.996410 kubelet[4038]: E0128 01:21:03.996339 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.996498 kubelet[4038]: E0128 01:21:03.996492 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.996533 kubelet[4038]: W0128 01:21:03.996498 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.996533 kubelet[4038]: E0128 01:21:03.996505 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.996673 kubelet[4038]: E0128 01:21:03.996662 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.996673 kubelet[4038]: W0128 01:21:03.996671 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.996729 kubelet[4038]: E0128 01:21:03.996677 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.996832 kubelet[4038]: E0128 01:21:03.996823 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.996832 kubelet[4038]: W0128 01:21:03.996830 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.996887 kubelet[4038]: E0128 01:21:03.996836 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.996965 kubelet[4038]: E0128 01:21:03.996943 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997001 kubelet[4038]: W0128 01:21:03.996970 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997001 kubelet[4038]: E0128 01:21:03.996978 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.997123 kubelet[4038]: E0128 01:21:03.997100 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997123 kubelet[4038]: W0128 01:21:03.997121 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997176 kubelet[4038]: E0128 01:21:03.997127 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.997270 kubelet[4038]: E0128 01:21:03.997243 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997270 kubelet[4038]: W0128 01:21:03.997265 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997325 kubelet[4038]: E0128 01:21:03.997275 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.997435 kubelet[4038]: E0128 01:21:03.997416 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997435 kubelet[4038]: W0128 01:21:03.997433 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997475 kubelet[4038]: E0128 01:21:03.997439 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.997636 kubelet[4038]: E0128 01:21:03.997587 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997636 kubelet[4038]: W0128 01:21:03.997596 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997636 kubelet[4038]: E0128 01:21:03.997603 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.997768 kubelet[4038]: E0128 01:21:03.997757 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997768 kubelet[4038]: W0128 01:21:03.997766 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.997810 kubelet[4038]: E0128 01:21:03.997774 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.997934 kubelet[4038]: E0128 01:21:03.997908 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.997934 kubelet[4038]: W0128 01:21:03.997930 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998022 kubelet[4038]: E0128 01:21:03.997937 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.998110 kubelet[4038]: E0128 01:21:03.998084 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998110 kubelet[4038]: W0128 01:21:03.998107 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998173 kubelet[4038]: E0128 01:21:03.998114 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.998278 kubelet[4038]: E0128 01:21:03.998253 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998278 kubelet[4038]: W0128 01:21:03.998274 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998335 kubelet[4038]: E0128 01:21:03.998281 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.998401 kubelet[4038]: E0128 01:21:03.998391 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998401 kubelet[4038]: W0128 01:21:03.998398 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998453 kubelet[4038]: E0128 01:21:03.998404 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.998522 kubelet[4038]: E0128 01:21:03.998512 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998522 kubelet[4038]: W0128 01:21:03.998519 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998573 kubelet[4038]: E0128 01:21:03.998525 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.998655 kubelet[4038]: E0128 01:21:03.998645 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998655 kubelet[4038]: W0128 01:21:03.998652 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998703 kubelet[4038]: E0128 01:21:03.998659 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.998851 kubelet[4038]: E0128 01:21:03.998777 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.998851 kubelet[4038]: W0128 01:21:03.998784 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.998851 kubelet[4038]: E0128 01:21:03.998791 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.999003 kubelet[4038]: E0128 01:21:03.998996 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.999055 kubelet[4038]: W0128 01:21:03.999040 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.999084 kubelet[4038]: E0128 01:21:03.999052 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:03.999425 kubelet[4038]: E0128 01:21:03.999338 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.999425 kubelet[4038]: W0128 01:21:03.999349 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.999425 kubelet[4038]: E0128 01:21:03.999371 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:03.999560 kubelet[4038]: E0128 01:21:03.999492 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:03.999560 kubelet[4038]: W0128 01:21:03.999497 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:03.999560 kubelet[4038]: E0128 01:21:03.999504 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:04.740723 kubelet[4038]: E0128 01:21:04.740687 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:04.798000 audit[4727]: NETFILTER_CFG table=filter:120 family=2 entries=21 op=nft_register_rule pid=4727 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:04.798000 audit[4727]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffeb24e5590 a2=0 a3=7ffeb24e557c items=0 ppid=4145 pid=4727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.798000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:04.805000 audit[4727]: NETFILTER_CFG table=nat:121 family=2 entries=19 op=nft_register_chain pid=4727 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:04.805000 audit[4727]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffeb24e5590 a2=0 a3=7ffeb24e557c items=0 ppid=4145 pid=4727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.805000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:04.842433 containerd[2557]: time="2026-01-28T01:21:04.842397245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:04.844622 containerd[2557]: time="2026-01-28T01:21:04.844487981Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: 
active requests=0, bytes read=0" Jan 28 01:21:04.848831 containerd[2557]: time="2026-01-28T01:21:04.848805161Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:04.880444 containerd[2557]: time="2026-01-28T01:21:04.880400536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:04.880975 containerd[2557]: time="2026-01-28T01:21:04.880943551Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.381806623s" Jan 28 01:21:04.881013 containerd[2557]: time="2026-01-28T01:21:04.880982649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 28 01:21:04.886533 containerd[2557]: time="2026-01-28T01:21:04.886507232Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 28 01:21:04.902262 containerd[2557]: time="2026-01-28T01:21:04.902174174Z" level=info msg="Container 5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:04.931020 containerd[2557]: time="2026-01-28T01:21:04.930994927Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6\"" Jan 28 01:21:04.931973 containerd[2557]: time="2026-01-28T01:21:04.931457018Z" level=info msg="StartContainer for \"5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6\"" Jan 28 01:21:04.932902 containerd[2557]: time="2026-01-28T01:21:04.932867055Z" level=info msg="connecting to shim 5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6" address="unix:///run/containerd/s/63ec6920ffca3a0bf7ee154659da7664c79cd53c19c4ad6e9db3e182f181f497" protocol=ttrpc version=3 Jan 28 01:21:04.954167 systemd[1]: Started cri-containerd-5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6.scope - libcontainer container 5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6. 
Jan 28 01:21:04.986000 audit: BPF prog-id=190 op=LOAD Jan 28 01:21:04.986000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4558 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561616661666330306437663462663535333336363366336364613339 Jan 28 01:21:04.986000 audit: BPF prog-id=191 op=LOAD Jan 28 01:21:04.986000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4558 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561616661666330306437663462663535333336363366336364613339 Jan 28 01:21:04.986000 audit: BPF prog-id=191 op=UNLOAD Jan 28 01:21:04.986000 audit[4728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561616661666330306437663462663535333336363366336364613339 Jan 28 01:21:04.986000 audit: BPF prog-id=190 op=UNLOAD Jan 28 01:21:04.986000 audit[4728]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561616661666330306437663462663535333336363366336364613339 Jan 28 01:21:04.986000 audit: BPF prog-id=192 op=LOAD Jan 28 01:21:04.986000 audit[4728]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4558 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:04.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561616661666330306437663462663535333336363366336364613339 Jan 28 01:21:05.003476 kubelet[4038]: E0128 01:21:05.002660 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: 
unexpected end of JSON input Jan 28 01:21:05.003476 kubelet[4038]: W0128 01:21:05.002862 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.003476 kubelet[4038]: E0128 01:21:05.002895 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.004827 kubelet[4038]: E0128 01:21:05.004571 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.004827 kubelet[4038]: W0128 01:21:05.004585 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.004827 kubelet[4038]: E0128 01:21:05.004729 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.005416 kubelet[4038]: E0128 01:21:05.005264 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.005416 kubelet[4038]: W0128 01:21:05.005323 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.005416 kubelet[4038]: E0128 01:21:05.005337 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.006268 kubelet[4038]: E0128 01:21:05.006110 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.006268 kubelet[4038]: W0128 01:21:05.006148 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.006268 kubelet[4038]: E0128 01:21:05.006162 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.006535 kubelet[4038]: E0128 01:21:05.006484 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.006535 kubelet[4038]: W0128 01:21:05.006492 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.006535 kubelet[4038]: E0128 01:21:05.006503 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:05.006870 kubelet[4038]: E0128 01:21:05.006857 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.007013 kubelet[4038]: W0128 01:21:05.006969 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.007013 kubelet[4038]: E0128 01:21:05.006984 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.007740 kubelet[4038]: E0128 01:21:05.007465 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.007740 kubelet[4038]: W0128 01:21:05.007475 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.007740 kubelet[4038]: E0128 01:21:05.007487 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.007740 kubelet[4038]: E0128 01:21:05.007608 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.007740 kubelet[4038]: W0128 01:21:05.007613 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.007740 kubelet[4038]: E0128 01:21:05.007620 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.008022 kubelet[4038]: E0128 01:21:05.007907 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.008022 kubelet[4038]: W0128 01:21:05.007914 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.008022 kubelet[4038]: E0128 01:21:05.007921 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.008223 kubelet[4038]: E0128 01:21:05.008158 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.008223 kubelet[4038]: W0128 01:21:05.008166 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.008223 kubelet[4038]: E0128 01:21:05.008198 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:05.008381 kubelet[4038]: E0128 01:21:05.008375 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.008444 kubelet[4038]: W0128 01:21:05.008409 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.008444 kubelet[4038]: E0128 01:21:05.008418 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.008585 kubelet[4038]: E0128 01:21:05.008563 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.008585 kubelet[4038]: W0128 01:21:05.008569 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.008585 kubelet[4038]: E0128 01:21:05.008575 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.008840 kubelet[4038]: E0128 01:21:05.008797 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.008840 kubelet[4038]: W0128 01:21:05.008805 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.008840 kubelet[4038]: E0128 01:21:05.008812 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.009059 kubelet[4038]: E0128 01:21:05.009022 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.009059 kubelet[4038]: W0128 01:21:05.009029 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.009059 kubelet[4038]: E0128 01:21:05.009036 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:05.009137 containerd[2557]: time="2026-01-28T01:21:05.009055178Z" level=info msg="StartContainer for \"5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6\" returns successfully" Jan 28 01:21:05.009675 kubelet[4038]: E0128 01:21:05.009644 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.009675 kubelet[4038]: W0128 01:21:05.009656 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.009866 kubelet[4038]: E0128 01:21:05.009757 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.010232 kubelet[4038]: E0128 01:21:05.010166 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.010232 kubelet[4038]: W0128 01:21:05.010175 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.010232 kubelet[4038]: E0128 01:21:05.010185 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.010650 kubelet[4038]: E0128 01:21:05.010620 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.010650 kubelet[4038]: W0128 01:21:05.010630 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.010650 kubelet[4038]: E0128 01:21:05.010641 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.010971 kubelet[4038]: E0128 01:21:05.010921 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.010971 kubelet[4038]: W0128 01:21:05.010928 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.011090 kubelet[4038]: E0128 01:21:05.010936 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 01:21:05.011264 kubelet[4038]: E0128 01:21:05.011243 4038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 01:21:05.011356 kubelet[4038]: W0128 01:21:05.011306 4038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 01:21:05.011356 kubelet[4038]: E0128 01:21:05.011318 4038 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 01:21:05.013255 systemd[1]: cri-containerd-5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6.scope: Deactivated successfully. Jan 28 01:21:05.017348 containerd[2557]: time="2026-01-28T01:21:05.017325223Z" level=info msg="received container exit event container_id:\"5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6\" id:\"5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6\" pid:4741 exited_at:{seconds:1769563265 nanos:16882652}" Jan 28 01:21:05.017000 audit: BPF prog-id=192 op=UNLOAD Jan 28 01:21:05.036326 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5aafafc00d7f4bf5533663f3cda3928de03d95ba64ea36aa39bbf40a280cdbd6-rootfs.mount: Deactivated successfully. Jan 28 01:21:06.739804 kubelet[4038]: E0128 01:21:06.739756 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:07.917462 containerd[2557]: time="2026-01-28T01:21:07.917407903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 28 01:21:08.740841 kubelet[4038]: E0128 01:21:08.740781 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:10.740586 kubelet[4038]: E0128 01:21:10.740547 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:11.323806 containerd[2557]: time="2026-01-28T01:21:11.323769287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:11.325810 containerd[2557]: time="2026-01-28T01:21:11.325776602Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70445002" Jan 28 01:21:11.328218 containerd[2557]: time="2026-01-28T01:21:11.328180694Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:11.330999 containerd[2557]: time="2026-01-28T01:21:11.330944085Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:11.331534 containerd[2557]: time="2026-01-28T01:21:11.331241819Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.413782011s" Jan 28 01:21:11.331534 containerd[2557]: time="2026-01-28T01:21:11.331267051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 28 01:21:11.339422 containerd[2557]: time="2026-01-28T01:21:11.339389284Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 28 01:21:11.355237 containerd[2557]: time="2026-01-28T01:21:11.354397235Z" level=info msg="Container 5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:11.369751 containerd[2557]: time="2026-01-28T01:21:11.369728007Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68\"" Jan 28 01:21:11.370416 containerd[2557]: time="2026-01-28T01:21:11.370111517Z" level=info msg="StartContainer for \"5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68\"" Jan 28 01:21:11.371351 containerd[2557]: time="2026-01-28T01:21:11.371322323Z" level=info msg="connecting to shim 5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68" address="unix:///run/containerd/s/63ec6920ffca3a0bf7ee154659da7664c79cd53c19c4ad6e9db3e182f181f497" protocol=ttrpc version=3 Jan 28 01:21:11.391138 systemd[1]: Started cri-containerd-5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68.scope - libcontainer container 5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68. 
Jan 28 01:21:11.431982 kernel: kauditd_printk_skb: 87 callbacks suppressed Jan 28 01:21:11.432056 kernel: audit: type=1334 audit(1769563271.428:588): prog-id=193 op=LOAD Jan 28 01:21:11.428000 audit: BPF prog-id=193 op=LOAD Jan 28 01:21:11.428000 audit[4809]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.438365 kernel: audit: type=1300 audit(1769563271.428:588): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.442264 kernel: audit: type=1327 audit(1769563271.428:588): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.428000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.444470 kernel: audit: type=1334 audit(1769563271.428:589): prog-id=194 op=LOAD Jan 28 01:21:11.428000 audit: BPF prog-id=194 op=LOAD Jan 28 01:21:11.447958 kernel: audit: type=1300 audit(1769563271.428:589): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 audit[4809]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.455538 kernel: audit: type=1327 audit(1769563271.428:589): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.457572 kernel: audit: type=1334 audit(1769563271.428:590): prog-id=194 op=UNLOAD Jan 28 01:21:11.428000 audit: BPF prog-id=194 op=UNLOAD Jan 28 01:21:11.464735 kernel: audit: type=1300 audit(1769563271.428:590): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 
audit[4809]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.428000 audit: BPF prog-id=193 op=UNLOAD Jan 28 01:21:11.471086 kernel: audit: type=1327 audit(1769563271.428:590): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.471124 kernel: audit: type=1334 audit(1769563271.428:591): prog-id=193 op=UNLOAD Jan 28 01:21:11.428000 audit[4809]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.428000 audit: BPF prog-id=195 op=LOAD Jan 28 01:21:11.428000 audit[4809]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=4558 pid=4809 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:11.428000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566613730323464353566623538623038323732633665353930316235 Jan 28 01:21:11.478944 containerd[2557]: time="2026-01-28T01:21:11.478918851Z" level=info msg="StartContainer for \"5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68\" returns successfully" Jan 28 01:21:12.710611 systemd[1]: cri-containerd-5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68.scope: Deactivated successfully. Jan 28 01:21:12.710902 systemd[1]: cri-containerd-5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68.scope: Consumed 377ms CPU time, 190.3M memory peak, 171.3M written to disk. Jan 28 01:21:12.712320 containerd[2557]: time="2026-01-28T01:21:12.712203735Z" level=info msg="received container exit event container_id:\"5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68\" id:\"5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68\" pid:4822 exited_at:{seconds:1769563272 nanos:711850191}" Jan 28 01:21:12.716000 audit: BPF prog-id=195 op=UNLOAD Jan 28 01:21:12.731894 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5fa7024d55fb58b08272c6e5901b5b524928b61b87a718d0cdb3891907b26d68-rootfs.mount: Deactivated successfully. 
Jan 28 01:21:12.740551 kubelet[4038]: E0128 01:21:12.740518 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:12.748188 kubelet[4038]: I0128 01:21:12.747427 4038 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 28 01:21:12.959741 kubelet[4038]: I0128 01:21:12.959711 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlfcg\" (UniqueName: \"kubernetes.io/projected/61f82efe-84a3-4755-b723-be12f0bfe442-kube-api-access-nlfcg\") pod \"whisker-6fc7f886b-rpbrp\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " pod="calico-system/whisker-6fc7f886b-rpbrp" Jan 28 01:21:12.959868 kubelet[4038]: I0128 01:21:12.959828 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle\") pod \"whisker-6fc7f886b-rpbrp\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " pod="calico-system/whisker-6fc7f886b-rpbrp" Jan 28 01:21:12.959868 kubelet[4038]: I0128 01:21:12.959852 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair\") pod \"whisker-6fc7f886b-rpbrp\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " pod="calico-system/whisker-6fc7f886b-rpbrp" Jan 28 01:21:13.074509 kubelet[4038]: E0128 01:21:13.061043 4038 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: object "calico-system"/"whisker-ca-bundle" not registered Jan 28 01:21:13.074509 kubelet[4038]: E0128 01:21:13.061120 4038 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle podName:61f82efe-84a3-4755-b723-be12f0bfe442 nodeName:}" failed. No retries permitted until 2026-01-28 01:21:13.561098927 +0000 UTC m=+33.904717585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle") pod "whisker-6fc7f886b-rpbrp" (UID: "61f82efe-84a3-4755-b723-be12f0bfe442") : object "calico-system"/"whisker-ca-bundle" not registered Jan 28 01:21:13.074509 kubelet[4038]: E0128 01:21:13.061123 4038 secret.go:189] Couldn't get secret calico-system/whisker-backend-key-pair: object "calico-system"/"whisker-backend-key-pair" not registered Jan 28 01:21:13.074509 kubelet[4038]: E0128 01:21:13.061158 4038 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair podName:61f82efe-84a3-4755-b723-be12f0bfe442 nodeName:}" failed. No retries permitted until 2026-01-28 01:21:13.561150089 +0000 UTC m=+33.904768754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-backend-key-pair" (UniqueName: "kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair") pod "whisker-6fc7f886b-rpbrp" (UID: "61f82efe-84a3-4755-b723-be12f0bfe442") : object "calico-system"/"whisker-backend-key-pair" not registered Jan 28 01:21:13.089243 systemd[1]: Created slice kubepods-besteffort-pod61f82efe_84a3_4755_b723_be12f0bfe442.slice - libcontainer container kubepods-besteffort-pod61f82efe_84a3_4755_b723_be12f0bfe442.slice. Jan 28 01:21:13.171922 kubelet[4038]: I0128 01:21:13.161444 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b2b158a-081b-4454-a96f-65445d9cadc6-tigera-ca-bundle\") pod \"calico-kube-controllers-5496f89df7-4vb68\" (UID: \"8b2b158a-081b-4454-a96f-65445d9cadc6\") " pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" Jan 28 01:21:13.171922 kubelet[4038]: I0128 01:21:13.161491 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wb2h\" (UniqueName: \"kubernetes.io/projected/8b2b158a-081b-4454-a96f-65445d9cadc6-kube-api-access-7wb2h\") pod \"calico-kube-controllers-5496f89df7-4vb68\" (UID: \"8b2b158a-081b-4454-a96f-65445d9cadc6\") " pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" Jan 28 01:21:13.225765 systemd[1]: Created slice kubepods-burstable-pode5b3670b_e45e_4499_a525_031c765e8a68.slice - libcontainer container kubepods-burstable-pode5b3670b_e45e_4499_a525_031c765e8a68.slice. Jan 28 01:21:13.262467 kubelet[4038]: I0128 01:21:13.262432 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twds4\" (UniqueName: \"kubernetes.io/projected/e5b3670b-e45e-4499-a525-031c765e8a68-kube-api-access-twds4\") pod \"coredns-674b8bbfcf-p46qn\" (UID: \"e5b3670b-e45e-4499-a525-031c765e8a68\") " pod="kube-system/coredns-674b8bbfcf-p46qn" Jan 28 01:21:13.262904 kubelet[4038]: I0128 01:21:13.262503 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5b3670b-e45e-4499-a525-031c765e8a68-config-volume\") pod \"coredns-674b8bbfcf-p46qn\" (UID: \"e5b3670b-e45e-4499-a525-031c765e8a68\") " pod="kube-system/coredns-674b8bbfcf-p46qn" Jan 28 01:21:13.528416 containerd[2557]: time="2026-01-28T01:21:13.528371185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-p46qn,Uid:e5b3670b-e45e-4499-a525-031c765e8a68,Namespace:kube-system,Attempt:0,}" Jan 28 01:21:13.574505 systemd[1]: Created slice kubepods-besteffort-pod8b2b158a_081b_4454_a96f_65445d9cadc6.slice - libcontainer container kubepods-besteffort-pod8b2b158a_081b_4454_a96f_65445d9cadc6.slice. Jan 28 01:21:13.576488 containerd[2557]: time="2026-01-28T01:21:13.576444152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5496f89df7-4vb68,Uid:8b2b158a-081b-4454-a96f-65445d9cadc6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:13.637197 systemd[1]: Created slice kubepods-besteffort-pod4b861133_0274_4274_bab9_748410e42edc.slice - libcontainer container kubepods-besteffort-pod4b861133_0274_4274_bab9_748410e42edc.slice. Jan 28 01:21:13.644786 systemd[1]: Created slice kubepods-besteffort-pod489aa6ff_974c_4c0f_ad71_b359b70146bf.slice - libcontainer container kubepods-besteffort-pod489aa6ff_974c_4c0f_ad71_b359b70146bf.slice. 
Jan 28 01:21:13.659660 systemd[1]: Created slice kubepods-burstable-podd30a5acc_464e_4af2_9051_87ebe5ea8e81.slice - libcontainer container kubepods-burstable-podd30a5acc_464e_4af2_9051_87ebe5ea8e81.slice. Jan 28 01:21:13.664428 kubelet[4038]: I0128 01:21:13.664392 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d68b64e3-e019-4732-8971-c8457279d8f6-config\") pod \"goldmane-666569f655-kz629\" (UID: \"d68b64e3-e019-4732-8971-c8457279d8f6\") " pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:13.664514 kubelet[4038]: I0128 01:21:13.664432 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjlfl\" (UniqueName: \"kubernetes.io/projected/4b861133-0274-4274-bab9-748410e42edc-kube-api-access-rjlfl\") pod \"calico-apiserver-db6789d8-bg2b5\" (UID: \"4b861133-0274-4274-bab9-748410e42edc\") " pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" Jan 28 01:21:13.664514 kubelet[4038]: I0128 01:21:13.664449 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d68b64e3-e019-4732-8971-c8457279d8f6-goldmane-key-pair\") pod \"goldmane-666569f655-kz629\" (UID: \"d68b64e3-e019-4732-8971-c8457279d8f6\") " pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:13.664514 kubelet[4038]: I0128 01:21:13.664469 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d30a5acc-464e-4af2-9051-87ebe5ea8e81-config-volume\") pod \"coredns-674b8bbfcf-mb789\" (UID: \"d30a5acc-464e-4af2-9051-87ebe5ea8e81\") " pod="kube-system/coredns-674b8bbfcf-mb789" Jan 28 01:21:13.664514 kubelet[4038]: I0128 01:21:13.664485 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/489aa6ff-974c-4c0f-ad71-b359b70146bf-calico-apiserver-certs\") pod \"calico-apiserver-db6789d8-85rm2\" (UID: \"489aa6ff-974c-4c0f-ad71-b359b70146bf\") " pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" Jan 28 01:21:13.664514 kubelet[4038]: I0128 01:21:13.664502 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvc2c\" (UniqueName: \"kubernetes.io/projected/d68b64e3-e019-4732-8971-c8457279d8f6-kube-api-access-qvc2c\") pod \"goldmane-666569f655-kz629\" (UID: \"d68b64e3-e019-4732-8971-c8457279d8f6\") " pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:13.664625 kubelet[4038]: I0128 01:21:13.664531 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55wb\" (UniqueName: \"kubernetes.io/projected/d30a5acc-464e-4af2-9051-87ebe5ea8e81-kube-api-access-r55wb\") pod \"coredns-674b8bbfcf-mb789\" (UID: \"d30a5acc-464e-4af2-9051-87ebe5ea8e81\") " pod="kube-system/coredns-674b8bbfcf-mb789" Jan 28 01:21:13.664625 kubelet[4038]: I0128 01:21:13.664551 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkxk\" (UniqueName: \"kubernetes.io/projected/489aa6ff-974c-4c0f-ad71-b359b70146bf-kube-api-access-2dkxk\") pod \"calico-apiserver-db6789d8-85rm2\" (UID: \"489aa6ff-974c-4c0f-ad71-b359b70146bf\") " pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" Jan 28 01:21:13.664625 
kubelet[4038]: I0128 01:21:13.664568 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4b861133-0274-4274-bab9-748410e42edc-calico-apiserver-certs\") pod \"calico-apiserver-db6789d8-bg2b5\" (UID: \"4b861133-0274-4274-bab9-748410e42edc\") " pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" Jan 28 01:21:13.664625 kubelet[4038]: I0128 01:21:13.664596 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d68b64e3-e019-4732-8971-c8457279d8f6-goldmane-ca-bundle\") pod \"goldmane-666569f655-kz629\" (UID: \"d68b64e3-e019-4732-8971-c8457279d8f6\") " pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:13.670394 systemd[1]: Created slice kubepods-besteffort-podd68b64e3_e019_4732_8971_c8457279d8f6.slice - libcontainer container kubepods-besteffort-podd68b64e3_e019_4732_8971_c8457279d8f6.slice. Jan 28 01:21:13.691754 containerd[2557]: time="2026-01-28T01:21:13.691726239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fc7f886b-rpbrp,Uid:61f82efe-84a3-4755-b723-be12f0bfe442,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:13.750248 containerd[2557]: time="2026-01-28T01:21:13.750190638Z" level=error msg="Failed to destroy network for sandbox \"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.753176 systemd[1]: run-netns-cni\x2d17f0f050\x2df2bf\x2d25aa\x2d01d8\x2d38c54c6ebb3b.mount: Deactivated successfully. Jan 28 01:21:13.755515 containerd[2557]: time="2026-01-28T01:21:13.755482202Z" level=error msg="Failed to destroy network for sandbox \"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.756851 containerd[2557]: time="2026-01-28T01:21:13.756810114Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-p46qn,Uid:e5b3670b-e45e-4499-a525-031c765e8a68,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.757540 systemd[1]: run-netns-cni\x2d2dd9975f\x2d8bf5\x2d5965\x2d18ff\x2dc61ae02f7873.mount: Deactivated successfully. 
Jan 28 01:21:13.758093 kubelet[4038]: E0128 01:21:13.757911 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.758093 kubelet[4038]: E0128 01:21:13.758008 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-p46qn" Jan 28 01:21:13.758093 kubelet[4038]: E0128 01:21:13.758028 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-p46qn" Jan 28 01:21:13.758503 kubelet[4038]: E0128 01:21:13.758397 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-p46qn_kube-system(e5b3670b-e45e-4499-a525-031c765e8a68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-p46qn_kube-system(e5b3670b-e45e-4499-a525-031c765e8a68)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90d996a3d36f3c38da87e233b2c335ee080aa5ff8562538cb1c56c107d07949b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-p46qn" podUID="e5b3670b-e45e-4499-a525-031c765e8a68" Jan 28 01:21:13.763428 containerd[2557]: time="2026-01-28T01:21:13.763396559Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5496f89df7-4vb68,Uid:8b2b158a-081b-4454-a96f-65445d9cadc6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.763974 kubelet[4038]: E0128 01:21:13.763933 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.764063 kubelet[4038]: E0128 01:21:13.763987 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" Jan 28 01:21:13.764063 kubelet[4038]: E0128 01:21:13.764003 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" Jan 28 01:21:13.764063 kubelet[4038]: E0128 01:21:13.764041 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cad9ff8355560fdd2a5932303e7d6ad5ec3a4c41fa2ee024df02bb11104e35f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:21:13.765971 containerd[2557]: time="2026-01-28T01:21:13.765913524Z" level=error msg="Failed to destroy network for sandbox \"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.770889 systemd[1]: run-netns-cni\x2d974e509b\x2d3a57\x2deb74\x2da9ac\x2d1a5f163a78a2.mount: Deactivated successfully. 
Jan 28 01:21:13.773568 containerd[2557]: time="2026-01-28T01:21:13.773494682Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fc7f886b-rpbrp,Uid:61f82efe-84a3-4755-b723-be12f0bfe442,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.778242 kubelet[4038]: E0128 01:21:13.778216 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:13.778308 kubelet[4038]: E0128 01:21:13.778254 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6fc7f886b-rpbrp" Jan 28 01:21:13.778308 kubelet[4038]: E0128 01:21:13.778270 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6fc7f886b-rpbrp" Jan 28 01:21:13.778358 kubelet[4038]: E0128 01:21:13.778306 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6fc7f886b-rpbrp_calico-system(61f82efe-84a3-4755-b723-be12f0bfe442)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6fc7f886b-rpbrp_calico-system(61f82efe-84a3-4755-b723-be12f0bfe442)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"611248ad5a5e138debe5405773232c2e9f1c67aba2031921229a5ab0fb3cf838\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6fc7f886b-rpbrp" podUID="61f82efe-84a3-4755-b723-be12f0bfe442" Jan 28 01:21:13.932139 containerd[2557]: time="2026-01-28T01:21:13.932104882Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 28 01:21:13.943438 containerd[2557]: time="2026-01-28T01:21:13.943407372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-bg2b5,Uid:4b861133-0274-4274-bab9-748410e42edc,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:21:13.950393 containerd[2557]: time="2026-01-28T01:21:13.950364209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-85rm2,Uid:489aa6ff-974c-4c0f-ad71-b359b70146bf,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:21:13.967193 containerd[2557]: time="2026-01-28T01:21:13.967067642Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mb789,Uid:d30a5acc-464e-4af2-9051-87ebe5ea8e81,Namespace:kube-system,Attempt:0,}" Jan 28 01:21:13.979590 containerd[2557]: time="2026-01-28T01:21:13.979568293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-kz629,Uid:d68b64e3-e019-4732-8971-c8457279d8f6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:14.020080 containerd[2557]: time="2026-01-28T01:21:14.020048687Z" level=error msg="Failed to destroy network for sandbox \"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.024759 containerd[2557]: time="2026-01-28T01:21:14.024713464Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-85rm2,Uid:489aa6ff-974c-4c0f-ad71-b359b70146bf,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.025531 kubelet[4038]: E0128 01:21:14.024945 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.025601 kubelet[4038]: E0128 01:21:14.025559 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" Jan 28 01:21:14.025601 kubelet[4038]: E0128 01:21:14.025590 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" Jan 28 01:21:14.026746 kubelet[4038]: E0128 01:21:14.025873 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"39b2faf1949ab6fad1f081e7340b1d726eed55c63384993c6bc91f71a90c76db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:14.029347 containerd[2557]: time="2026-01-28T01:21:14.029313191Z" level=error msg="Failed to destroy network for sandbox \"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.037198 containerd[2557]: time="2026-01-28T01:21:14.037165730Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-bg2b5,Uid:4b861133-0274-4274-bab9-748410e42edc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.037681 kubelet[4038]: E0128 01:21:14.037637 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.037741 kubelet[4038]: E0128 01:21:14.037697 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" Jan 28 01:21:14.037741 kubelet[4038]: E0128 01:21:14.037715 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" Jan 28 01:21:14.037786 kubelet[4038]: E0128 01:21:14.037769 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b75a1b106ec9becd72bd3109f9f3378ba8b21774aef8e597e0bfad805cc2312c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:14.058553 containerd[2557]: time="2026-01-28T01:21:14.058516680Z" level=error msg="Failed to destroy network for sandbox 
\"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.063457 containerd[2557]: time="2026-01-28T01:21:14.063422077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mb789,Uid:d30a5acc-464e-4af2-9051-87ebe5ea8e81,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.063653 kubelet[4038]: E0128 01:21:14.063631 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.063738 kubelet[4038]: E0128 01:21:14.063727 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mb789" Jan 28 01:21:14.063787 kubelet[4038]: E0128 01:21:14.063777 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mb789" Jan 28 01:21:14.063873 kubelet[4038]: E0128 01:21:14.063854 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-mb789_kube-system(d30a5acc-464e-4af2-9051-87ebe5ea8e81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-mb789_kube-system(d30a5acc-464e-4af2-9051-87ebe5ea8e81)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7066d0fcb05a06a1fbc54447502e0cae292f822afcf887a58532907ad40d2cbf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-mb789" podUID="d30a5acc-464e-4af2-9051-87ebe5ea8e81" Jan 28 01:21:14.065165 containerd[2557]: time="2026-01-28T01:21:14.065124154Z" level=error msg="Failed to destroy network for sandbox \"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.069503 containerd[2557]: time="2026-01-28T01:21:14.069471455Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-kz629,Uid:d68b64e3-e019-4732-8971-c8457279d8f6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.069620 kubelet[4038]: E0128 01:21:14.069595 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.069659 kubelet[4038]: E0128 01:21:14.069631 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:14.069659 kubelet[4038]: E0128 01:21:14.069646 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-kz629" Jan 28 01:21:14.069721 kubelet[4038]: E0128 01:21:14.069690 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f95bfa26c23975428decd8b08d0541e8d20ba44341b78b7dc10ae10997137ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:21:14.744297 systemd[1]: Created slice kubepods-besteffort-pod74fe3431_17ca_4902_9eb5_64c3701d8bd6.slice - libcontainer container kubepods-besteffort-pod74fe3431_17ca_4902_9eb5_64c3701d8bd6.slice. 
Jan 28 01:21:14.746114 containerd[2557]: time="2026-01-28T01:21:14.746076704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lcd4c,Uid:74fe3431-17ca-4902-9eb5-64c3701d8bd6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:14.788386 containerd[2557]: time="2026-01-28T01:21:14.788352800Z" level=error msg="Failed to destroy network for sandbox \"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.790335 systemd[1]: run-netns-cni\x2d103ee9b6\x2d4d1d\x2d25db\x2d8dad\x2d98805b6f63d1.mount: Deactivated successfully. Jan 28 01:21:14.798396 containerd[2557]: time="2026-01-28T01:21:14.798363984Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lcd4c,Uid:74fe3431-17ca-4902-9eb5-64c3701d8bd6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.798784 kubelet[4038]: E0128 01:21:14.798534 4038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 01:21:14.798784 kubelet[4038]: E0128 01:21:14.798587 4038 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:14.798784 kubelet[4038]: E0128 01:21:14.798604 4038 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lcd4c" Jan 28 01:21:14.799098 kubelet[4038]: E0128 01:21:14.798662 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8417e3151fe2225ff8f93d2697644443e1b1e89bcd3ce56873ad5a1e74f9a63f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" 
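The transient mount units being cleaned up here (run-netns-cni\x2d103ee9b6\x2d...mount, and run-netns-cni\x2d974e509b\x2d...mount earlier) are just systemd-escaped paths: "/" becomes "-" and a literal "-" becomes "\x2d". The small Python sketch below reverses the escaping for the unit named above; on the host, systemd-escape --unescape --path is believed to do the same.

    import re

    unit = r"run-netns-cni\x2d103ee9b6\x2d4d1d\x2d25db\x2d8dad\x2d98805b6f63d1.mount"

    name = unit[: -len(".mount")]            # drop the unit-type suffix
    path = "/" + name.replace("-", "/")      # "-" separators stood for "/" in the path
    path = re.sub(r"\\x([0-9a-fA-F]{2})",    # "\xNN" escapes hold the literal bytes
                  lambda m: chr(int(m.group(1), 16)), path)
    print(path)   # /run/netns/cni-103ee9b6-4d1d-25db-8dad-98805b6f63d1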
Jan 28 01:21:19.689155 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount473545757.mount: Deactivated successfully. Jan 28 01:21:19.720158 containerd[2557]: time="2026-01-28T01:21:19.720117167Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:19.722242 containerd[2557]: time="2026-01-28T01:21:19.722161812Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 28 01:21:19.727068 containerd[2557]: time="2026-01-28T01:21:19.727046378Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:19.729961 containerd[2557]: time="2026-01-28T01:21:19.729902794Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 01:21:19.730306 containerd[2557]: time="2026-01-28T01:21:19.730190162Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.798043134s" Jan 28 01:21:19.730306 containerd[2557]: time="2026-01-28T01:21:19.730220574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 28 01:21:19.746238 containerd[2557]: time="2026-01-28T01:21:19.746210404Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 28 01:21:19.767482 containerd[2557]: time="2026-01-28T01:21:19.766051232Z" level=info msg="Container fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:19.769744 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2082716877.mount: Deactivated successfully. Jan 28 01:21:19.780989 containerd[2557]: time="2026-01-28T01:21:19.780963945Z" level=info msg="CreateContainer within sandbox \"73916712e7f840593dbf75e61c31f7322349426dc9fdd0a0ea1fea41e070d92a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4\"" Jan 28 01:21:19.781395 containerd[2557]: time="2026-01-28T01:21:19.781371590Z" level=info msg="StartContainer for \"fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4\"" Jan 28 01:21:19.782732 containerd[2557]: time="2026-01-28T01:21:19.782708703Z" level=info msg="connecting to shim fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4" address="unix:///run/containerd/s/63ec6920ffca3a0bf7ee154659da7664c79cd53c19c4ad6e9db3e182f181f497" protocol=ttrpc version=3 Jan 28 01:21:19.800124 systemd[1]: Started cri-containerd-fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4.scope - libcontainer container fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4. 
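The calico/node pull that started at 01:21:13.932 completes here after the reported 5.798043134s with 156,880,025 bytes read, after which the calico-node container is created and started in sandbox 73916712e7f8... A rough transfer rate can be read off those two numbers; this is only back-of-the-envelope arithmetic on values in the log (the "size 156883537" in the repo-digest line is the registry-reported size, which differs slightly from the bytes actually read).

    bytes_read = 156_880_025        # "bytes read" reported when the pull stopped
    seconds = 5.798043134           # duration reported by the "Pulled image" record
    print(f"~{bytes_read / seconds / 1e6:.1f} MB/s")   # ~27.1 MB/s from ghcr.io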
Jan 28 01:21:19.840000 audit: BPF prog-id=196 op=LOAD Jan 28 01:21:19.842973 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 28 01:21:19.843046 kernel: audit: type=1334 audit(1769563279.840:594): prog-id=196 op=LOAD Jan 28 01:21:19.840000 audit[5078]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.848983 kernel: audit: type=1300 audit(1769563279.840:594): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.854978 kernel: audit: type=1327 audit(1769563279.840:594): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.840000 audit: BPF prog-id=197 op=LOAD Jan 28 01:21:19.862092 kernel: audit: type=1334 audit(1769563279.840:595): prog-id=197 op=LOAD Jan 28 01:21:19.862177 kernel: audit: type=1300 audit(1769563279.840:595): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit[5078]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.867258 kernel: audit: type=1327 audit(1769563279.840:595): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.840000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.840000 audit: BPF prog-id=197 op=UNLOAD Jan 28 01:21:19.874884 kernel: audit: type=1334 audit(1769563279.840:596): prog-id=197 op=UNLOAD Jan 28 01:21:19.874973 kernel: audit: type=1300 audit(1769563279.840:596): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit[5078]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.880917 kernel: audit: type=1327 audit(1769563279.840:596): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.880997 kernel: audit: type=1334 audit(1769563279.840:597): prog-id=196 op=UNLOAD Jan 28 01:21:19.840000 audit: BPF prog-id=196 op=UNLOAD Jan 28 01:21:19.840000 audit[5078]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.840000 audit: BPF prog-id=198 op=LOAD Jan 28 01:21:19.840000 audit[5078]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4558 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:19.840000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662393665353039616434316162393761363930613937343430616565 Jan 28 01:21:19.899570 containerd[2557]: time="2026-01-28T01:21:19.899541301Z" level=info msg="StartContainer for \"fb96e509ad41ab97a690a97440aee5cc822cbf01be40bada1cadb6ee77d79ac4\" returns successfully" Jan 28 01:21:20.124677 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 28 01:21:20.124769 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
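The audit records surrounding the calico-node start are BPF program loads issued by runc; auditd stores the command line as a hex-encoded, NUL-separated PROCTITLE field, which decodes to runc --root /run/containerd/runc/k8s.io --log ... (cut off at auditd's proctitle length limit). A short Python sketch of the decoding, using only the leading portion of the hex value quoted above; on a live system, ausearch -i is believed to render the same field readably.

    hex_title = ("72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F"
                 "6B38732E696F002D2D6C6F67")   # leading bytes of the PROCTITLE above
    argv = [a.decode() for a in bytes.fromhex(hex_title).split(b"\x00")]
    print(argv)   # ['runc', '--root', '/run/containerd/runc/k8s.io', '--log']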
Jan 28 01:21:20.209358 kubelet[4038]: I0128 01:21:20.209298 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r5dkf" podStartSLOduration=1.727661491 podStartE2EDuration="20.209277767s" podCreationTimestamp="2026-01-28 01:21:00 +0000 UTC" firstStartedPulling="2026-01-28 01:21:01.249225105 +0000 UTC m=+21.592843773" lastFinishedPulling="2026-01-28 01:21:19.730841393 +0000 UTC m=+40.074460049" observedRunningTime="2026-01-28 01:21:19.965292378 +0000 UTC m=+40.308911062" watchObservedRunningTime="2026-01-28 01:21:20.209277767 +0000 UTC m=+40.552896459" Jan 28 01:21:20.296671 kubelet[4038]: I0128 01:21:20.296636 4038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlfcg\" (UniqueName: \"kubernetes.io/projected/61f82efe-84a3-4755-b723-be12f0bfe442-kube-api-access-nlfcg\") pod \"61f82efe-84a3-4755-b723-be12f0bfe442\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " Jan 28 01:21:20.296794 kubelet[4038]: I0128 01:21:20.296684 4038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle\") pod \"61f82efe-84a3-4755-b723-be12f0bfe442\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " Jan 28 01:21:20.296794 kubelet[4038]: I0128 01:21:20.296709 4038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair\") pod \"61f82efe-84a3-4755-b723-be12f0bfe442\" (UID: \"61f82efe-84a3-4755-b723-be12f0bfe442\") " Jan 28 01:21:20.297688 kubelet[4038]: I0128 01:21:20.297572 4038 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "61f82efe-84a3-4755-b723-be12f0bfe442" (UID: "61f82efe-84a3-4755-b723-be12f0bfe442"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 28 01:21:20.300077 kubelet[4038]: I0128 01:21:20.299993 4038 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "61f82efe-84a3-4755-b723-be12f0bfe442" (UID: "61f82efe-84a3-4755-b723-be12f0bfe442"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 28 01:21:20.300934 kubelet[4038]: I0128 01:21:20.300876 4038 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f82efe-84a3-4755-b723-be12f0bfe442-kube-api-access-nlfcg" (OuterVolumeSpecName: "kube-api-access-nlfcg") pod "61f82efe-84a3-4755-b723-be12f0bfe442" (UID: "61f82efe-84a3-4755-b723-be12f0bfe442"). InnerVolumeSpecName "kube-api-access-nlfcg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 28 01:21:20.397491 kubelet[4038]: I0128 01:21:20.397463 4038 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-ca-bundle\") on node \"ci-4593.0.0-n-84a137a86c\" DevicePath \"\"" Jan 28 01:21:20.397491 kubelet[4038]: I0128 01:21:20.397487 4038 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/61f82efe-84a3-4755-b723-be12f0bfe442-whisker-backend-key-pair\") on node \"ci-4593.0.0-n-84a137a86c\" DevicePath \"\"" Jan 28 01:21:20.397491 kubelet[4038]: I0128 01:21:20.397497 4038 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlfcg\" (UniqueName: \"kubernetes.io/projected/61f82efe-84a3-4755-b723-be12f0bfe442-kube-api-access-nlfcg\") on node \"ci-4593.0.0-n-84a137a86c\" DevicePath \"\"" Jan 28 01:21:20.687688 systemd[1]: var-lib-kubelet-pods-61f82efe\x2d84a3\x2d4755\x2db723\x2dbe12f0bfe442-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 28 01:21:20.687768 systemd[1]: var-lib-kubelet-pods-61f82efe\x2d84a3\x2d4755\x2db723\x2dbe12f0bfe442-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnlfcg.mount: Deactivated successfully. Jan 28 01:21:20.951173 systemd[1]: Removed slice kubepods-besteffort-pod61f82efe_84a3_4755_b723_be12f0bfe442.slice - libcontainer container kubepods-besteffort-pod61f82efe_84a3_4755_b723_be12f0bfe442.slice. Jan 28 01:21:21.041943 systemd[1]: Created slice kubepods-besteffort-pod71a93f75_99db_41f8_a193_bdcc3af98dc1.slice - libcontainer container kubepods-besteffort-pod71a93f75_99db_41f8_a193_bdcc3af98dc1.slice. Jan 28 01:21:21.101611 kubelet[4038]: I0128 01:21:21.101582 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdzp\" (UniqueName: \"kubernetes.io/projected/71a93f75-99db-41f8-a193-bdcc3af98dc1-kube-api-access-2sdzp\") pod \"whisker-6f6448978d-8pkwl\" (UID: \"71a93f75-99db-41f8-a193-bdcc3af98dc1\") " pod="calico-system/whisker-6f6448978d-8pkwl" Jan 28 01:21:21.101611 kubelet[4038]: I0128 01:21:21.101616 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71a93f75-99db-41f8-a193-bdcc3af98dc1-whisker-ca-bundle\") pod \"whisker-6f6448978d-8pkwl\" (UID: \"71a93f75-99db-41f8-a193-bdcc3af98dc1\") " pod="calico-system/whisker-6f6448978d-8pkwl" Jan 28 01:21:21.101797 kubelet[4038]: I0128 01:21:21.101632 4038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/71a93f75-99db-41f8-a193-bdcc3af98dc1-whisker-backend-key-pair\") pod \"whisker-6f6448978d-8pkwl\" (UID: \"71a93f75-99db-41f8-a193-bdcc3af98dc1\") " pod="calico-system/whisker-6f6448978d-8pkwl" Jan 28 01:21:21.350123 containerd[2557]: time="2026-01-28T01:21:21.350028553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f6448978d-8pkwl,Uid:71a93f75-99db-41f8-a193-bdcc3af98dc1,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:21.542727 systemd-networkd[2191]: calic060f752078: Link UP Jan 28 01:21:21.543829 systemd-networkd[2191]: calic060f752078: Gained carrier Jan 28 01:21:21.572923 containerd[2557]: 2026-01-28 01:21:21.392 [INFO][5217] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 28 
01:21:21.572923 containerd[2557]: 2026-01-28 01:21:21.404 [INFO][5217] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0 whisker-6f6448978d- calico-system 71a93f75-99db-41f8-a193-bdcc3af98dc1 904 0 2026-01-28 01:21:21 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f6448978d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c whisker-6f6448978d-8pkwl eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic060f752078 [] [] }} ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-" Jan 28 01:21:21.572923 containerd[2557]: 2026-01-28 01:21:21.404 [INFO][5217] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.572923 containerd[2557]: 2026-01-28 01:21:21.443 [INFO][5255] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" HandleID="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Workload="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.443 [INFO][5255] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" HandleID="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Workload="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f070), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"whisker-6f6448978d-8pkwl", "timestamp":"2026-01-28 01:21:21.443426633 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.443 [INFO][5255] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.443 [INFO][5255] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.443 [INFO][5255] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.454 [INFO][5255] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.457 [INFO][5255] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.461 [INFO][5255] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.464 [INFO][5255] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573232 containerd[2557]: 2026-01-28 01:21:21.466 [INFO][5255] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.466 [INFO][5255] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.467 [INFO][5255] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.472 [INFO][5255] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.480 [INFO][5255] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.1/26] block=192.168.98.0/26 handle="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.480 [INFO][5255] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.1/26] handle="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.480 [INFO][5255] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
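The IPAM sequence above shows the plugin confirming this node's affinity for the 192.168.98.0/26 block and then claiming 192.168.98.1/26 for the whisker pod, the first address handed out from that block (a /26 is Calico's default block size, i.e. 64 addresses per node block). A small sanity check with Python's ipaddress module follows; calicoctl ipam show --show-blocks is believed to display the same allocation state on the cluster, though that command does not appear in this log.

    import ipaddress

    block = ipaddress.ip_network("192.168.98.0/26")    # the block this node holds
    pod_ip = ipaddress.ip_address("192.168.98.1")      # address claimed for the pod
    print(block.num_addresses, pod_ip in block)        # 64 True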
Jan 28 01:21:21.573433 containerd[2557]: 2026-01-28 01:21:21.480 [INFO][5255] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.1/26] IPv6=[] ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" HandleID="k8s-pod-network.78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Workload="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.573564 containerd[2557]: 2026-01-28 01:21:21.485 [INFO][5217] cni-plugin/k8s.go 418: Populated endpoint ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0", GenerateName:"whisker-6f6448978d-", Namespace:"calico-system", SelfLink:"", UID:"71a93f75-99db-41f8-a193-bdcc3af98dc1", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f6448978d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"whisker-6f6448978d-8pkwl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic060f752078", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:21.573564 containerd[2557]: 2026-01-28 01:21:21.485 [INFO][5217] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.1/32] ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.573646 containerd[2557]: 2026-01-28 01:21:21.485 [INFO][5217] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic060f752078 ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.573646 containerd[2557]: 2026-01-28 01:21:21.544 [INFO][5217] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.573688 containerd[2557]: 2026-01-28 01:21:21.544 [INFO][5217] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" 
Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0", GenerateName:"whisker-6f6448978d-", Namespace:"calico-system", SelfLink:"", UID:"71a93f75-99db-41f8-a193-bdcc3af98dc1", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f6448978d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e", Pod:"whisker-6f6448978d-8pkwl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic060f752078", MAC:"f6:17:7a:e8:43:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:21.573746 containerd[2557]: 2026-01-28 01:21:21.569 [INFO][5217] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" Namespace="calico-system" Pod="whisker-6f6448978d-8pkwl" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-whisker--6f6448978d--8pkwl-eth0" Jan 28 01:21:21.622032 containerd[2557]: time="2026-01-28T01:21:21.621490543Z" level=info msg="connecting to shim 78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e" address="unix:///run/containerd/s/3a0bdae91eb8b735a10c7d85049186f25de5bd3bbad0eff1f9d5b365328db356" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:21.650516 systemd[1]: Started cri-containerd-78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e.scope - libcontainer container 78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e. 
Jan 28 01:21:21.668000 audit: BPF prog-id=199 op=LOAD Jan 28 01:21:21.669000 audit: BPF prog-id=200 op=LOAD Jan 28 01:21:21.669000 audit[5334]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.669000 audit: BPF prog-id=200 op=UNLOAD Jan 28 01:21:21.669000 audit[5334]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.670000 audit: BPF prog-id=201 op=LOAD Jan 28 01:21:21.670000 audit[5334]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.670000 audit: BPF prog-id=202 op=LOAD Jan 28 01:21:21.670000 audit[5334]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.670000 audit: BPF prog-id=202 op=UNLOAD Jan 28 01:21:21.670000 audit[5334]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.670000 audit: BPF prog-id=201 op=UNLOAD Jan 28 01:21:21.670000 audit[5334]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.670000 audit: BPF prog-id=203 op=LOAD Jan 28 01:21:21.670000 audit[5334]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=5320 pid=5334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738626537653866333734313764393563626365626239323635333965 Jan 28 01:21:21.737444 containerd[2557]: time="2026-01-28T01:21:21.737407092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f6448978d-8pkwl,Uid:71a93f75-99db-41f8-a193-bdcc3af98dc1,Namespace:calico-system,Attempt:0,} returns sandbox id \"78be7e8f37417d95cbcebb926539ebd7e24983479e9650fa96d02825cb30437e\"" Jan 28 01:21:21.739972 containerd[2557]: time="2026-01-28T01:21:21.739313875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:21:21.742699 kubelet[4038]: I0128 01:21:21.742663 4038 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f82efe-84a3-4755-b723-be12f0bfe442" path="/var/lib/kubelet/pods/61f82efe-84a3-4755-b723-be12f0bfe442/volumes" Jan 28 01:21:21.838000 audit: BPF prog-id=204 op=LOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffeb72eb3e0 a2=98 a3=1fffffffffffffff items=0 ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.838000 audit: BPF prog-id=204 op=UNLOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffeb72eb3b0 a3=0 items=0 ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.838000 audit: BPF prog-id=205 op=LOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffeb72eb2c0 a2=94 a3=3 items=0 
ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.838000 audit: BPF prog-id=205 op=UNLOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffeb72eb2c0 a2=94 a3=3 items=0 ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.838000 audit: BPF prog-id=206 op=LOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffeb72eb300 a2=94 a3=7ffeb72eb4e0 items=0 ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.838000 audit: BPF prog-id=206 op=UNLOAD Jan 28 01:21:21.838000 audit[5386]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffeb72eb300 a2=94 a3=7ffeb72eb4e0 items=0 ppid=5201 pid=5386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 01:21:21.841000 audit: BPF prog-id=207 op=LOAD Jan 28 01:21:21.841000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdbb1b40a0 a2=98 a3=3 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.841000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.841000 audit: BPF prog-id=207 op=UNLOAD Jan 28 01:21:21.841000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffdbb1b4070 a3=0 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.841000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 
01:21:21.841000 audit: BPF prog-id=208 op=LOAD Jan 28 01:21:21.841000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdbb1b3e90 a2=94 a3=54428f items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.841000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.841000 audit: BPF prog-id=208 op=UNLOAD Jan 28 01:21:21.841000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdbb1b3e90 a2=94 a3=54428f items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.841000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.842000 audit: BPF prog-id=209 op=LOAD Jan 28 01:21:21.842000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdbb1b3ec0 a2=94 a3=2 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.842000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.842000 audit: BPF prog-id=209 op=UNLOAD Jan 28 01:21:21.842000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdbb1b3ec0 a2=0 a3=2 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.842000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.949000 audit: BPF prog-id=210 op=LOAD Jan 28 01:21:21.949000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdbb1b3d80 a2=94 a3=1 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.949000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.949000 audit: BPF prog-id=210 op=UNLOAD Jan 28 01:21:21.949000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdbb1b3d80 a2=94 a3=1 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.949000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=211 op=LOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdbb1b3d70 a2=94 a3=4 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=211 op=UNLOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdbb1b3d70 a2=0 a3=4 items=0 
ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=212 op=LOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffdbb1b3bd0 a2=94 a3=5 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=212 op=UNLOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffdbb1b3bd0 a2=0 a3=5 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=213 op=LOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdbb1b3df0 a2=94 a3=6 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=213 op=UNLOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdbb1b3df0 a2=0 a3=6 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.958000 audit: BPF prog-id=214 op=LOAD Jan 28 01:21:21.958000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdbb1b35a0 a2=94 a3=88 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.958000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.959000 audit: BPF prog-id=215 op=LOAD Jan 28 01:21:21.959000 audit[5387]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffdbb1b3420 a2=94 a3=2 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.959000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.959000 audit: BPF prog-id=215 op=UNLOAD Jan 28 01:21:21.959000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffdbb1b3450 a2=0 a3=7ffdbb1b3550 items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.959000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.959000 audit: BPF prog-id=214 op=UNLOAD Jan 28 01:21:21.959000 audit[5387]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7bf4d10 a2=0 a3=88ad47fabfc8aec items=0 ppid=5201 pid=5387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.959000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 01:21:21.965000 audit: BPF prog-id=216 op=LOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea0e3e690 a2=98 a3=1999999999999999 items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:21.965000 audit: BPF prog-id=216 op=UNLOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffea0e3e660 a3=0 items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:21.965000 audit: BPF prog-id=217 op=LOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffea0e3e570 a2=94 a3=ffff items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:21.965000 audit: BPF prog-id=217 op=UNLOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea0e3e570 a2=94 a3=ffff items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:21.965000 audit: BPF prog-id=218 op=LOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=3 a0=5 a1=7ffea0e3e5b0 a2=94 a3=7ffea0e3e790 items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:21.965000 audit: BPF prog-id=218 op=UNLOAD Jan 28 01:21:21.965000 audit[5390]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffea0e3e5b0 a2=94 a3=7ffea0e3e790 items=0 ppid=5201 pid=5390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:21.965000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 01:21:22.042111 containerd[2557]: time="2026-01-28T01:21:22.042078042Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:22.044775 containerd[2557]: time="2026-01-28T01:21:22.044741079Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:21:22.044984 containerd[2557]: time="2026-01-28T01:21:22.044825461Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:22.045124 kubelet[4038]: E0128 01:21:22.044944 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:21:22.045124 kubelet[4038]: E0128 01:21:22.045060 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:21:22.046147 kubelet[4038]: E0128 01:21:22.045909 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0c0e29983b554c87af1b31cc149295e5,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:22.047872 containerd[2557]: time="2026-01-28T01:21:22.047796732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:21:22.051628 systemd-networkd[2191]: vxlan.calico: Link UP Jan 28 01:21:22.051634 systemd-networkd[2191]: vxlan.calico: Gained carrier Jan 28 01:21:22.069000 audit: BPF prog-id=219 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde9296f10 a2=98 a3=0 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=219 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffde9296ee0 a3=0 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=220 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde9296d20 a2=94 a3=54428f items=0 
ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=220 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffde9296d20 a2=94 a3=54428f items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=221 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde9296d50 a2=94 a3=2 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=221 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffde9296d50 a2=0 a3=2 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=222 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffde9296b00 a2=94 a3=4 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=222 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffde9296b00 a2=94 a3=4 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=223 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffde9296c00 a2=94 a3=7ffde9296d80 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=223 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffde9296c00 a2=0 a3=7ffde9296d80 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=224 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffde9296330 a2=94 a3=2 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=224 op=UNLOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffde9296330 a2=0 a3=2 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.069000 audit: BPF prog-id=225 op=LOAD Jan 28 01:21:22.069000 audit[5416]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffde9296430 a2=94 a3=30 items=0 ppid=5201 pid=5416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.069000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 01:21:22.079000 audit: BPF prog-id=226 op=LOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc36621f40 a2=98 a3=0 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.079000 audit: BPF prog-id=226 op=UNLOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc36621f10 a3=0 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.079000 audit: BPF prog-id=227 op=LOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc36621d30 a2=94 a3=54428f items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.079000 audit: BPF prog-id=227 op=UNLOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc36621d30 a2=94 a3=54428f items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.079000 audit: BPF prog-id=228 op=LOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc36621d60 a2=94 a3=2 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.079000 audit: BPF prog-id=228 op=UNLOAD Jan 28 01:21:22.079000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc36621d60 a2=0 a3=2 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.079000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.186000 audit: BPF prog-id=229 op=LOAD Jan 28 01:21:22.186000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc36621c20 a2=94 a3=1 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.186000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.186000 audit: BPF prog-id=229 op=UNLOAD Jan 28 01:21:22.186000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffc36621c20 a2=94 a3=1 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.186000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=230 op=LOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc36621c10 a2=94 a3=4 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=230 op=UNLOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffc36621c10 a2=0 a3=4 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=231 op=LOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc36621a70 a2=94 a3=5 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=231 op=UNLOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc36621a70 a2=0 a3=5 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=232 op=LOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc36621c90 a2=94 a3=6 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=232 op=UNLOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffc36621c90 a2=0 a3=6 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.194000 audit: BPF prog-id=233 op=LOAD Jan 28 01:21:22.194000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffc36621440 a2=94 a3=88 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.194000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.195000 audit: BPF prog-id=234 op=LOAD Jan 28 01:21:22.195000 audit[5421]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffc366212c0 a2=94 a3=2 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.195000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.195000 audit: BPF prog-id=234 op=UNLOAD Jan 28 01:21:22.195000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffc366212f0 a2=0 a3=7ffc366213f0 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.195000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.195000 audit: BPF prog-id=233 op=UNLOAD Jan 28 01:21:22.195000 audit[5421]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=e6ead10 a2=0 
a3=b4d29f2b9b6e5f41 items=0 ppid=5201 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.195000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 01:21:22.199000 audit: BPF prog-id=225 op=UNLOAD Jan 28 01:21:22.199000 audit[5201]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c0013907c0 a2=0 a3=0 items=0 ppid=5196 pid=5201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.199000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 28 01:21:22.288000 audit[5444]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=5444 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:22.288000 audit[5444]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffc3740a500 a2=0 a3=7ffc3740a4ec items=0 ppid=5201 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.288000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:22.294000 audit[5445]: NETFILTER_CFG table=nat:123 family=2 entries=15 op=nft_register_chain pid=5445 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:22.294000 audit[5445]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffc1aba2300 a2=0 a3=7ffc1aba22ec items=0 ppid=5201 pid=5445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.294000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:22.302898 containerd[2557]: time="2026-01-28T01:21:22.302862802Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:22.306340 containerd[2557]: time="2026-01-28T01:21:22.306306769Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:21:22.306393 containerd[2557]: time="2026-01-28T01:21:22.306381598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:22.306582 kubelet[4038]: E0128 01:21:22.306499 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:21:22.306582 kubelet[4038]: E0128 
01:21:22.306553 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:21:22.306732 kubelet[4038]: E0128 01:21:22.306681 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:22.307865 kubelet[4038]: E0128 01:21:22.307827 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:21:22.310000 audit[5442]: NETFILTER_CFG table=raw:124 family=2 entries=21 op=nft_register_chain pid=5442 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:22.310000 
audit[5442]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7fff3320ed40 a2=0 a3=7fff3320ed2c items=0 ppid=5201 pid=5442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.310000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:22.312000 audit[5446]: NETFILTER_CFG table=filter:125 family=2 entries=94 op=nft_register_chain pid=5446 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:22.312000 audit[5446]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7fff1f12d4a0 a2=0 a3=7fff1f12d48c items=0 ppid=5201 pid=5446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.312000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:22.949932 kubelet[4038]: E0128 01:21:22.949885 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:21:22.983000 audit[5460]: NETFILTER_CFG table=filter:126 family=2 entries=20 op=nft_register_rule pid=5460 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:22.983000 audit[5460]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcf11a0670 a2=0 a3=7ffcf11a065c items=0 ppid=4145 pid=5460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.983000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:22.987000 audit[5460]: NETFILTER_CFG table=nat:127 family=2 entries=14 op=nft_register_rule pid=5460 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:22.987000 audit[5460]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffcf11a0670 a2=0 a3=0 items=0 ppid=4145 pid=5460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:22.987000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:23.215130 systemd-networkd[2191]: calic060f752078: Gained IPv6LL Jan 28 01:21:23.343090 systemd-networkd[2191]: vxlan.calico: Gained IPv6LL Jan 28 01:21:25.742961 containerd[2557]: time="2026-01-28T01:21:25.742916057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-p46qn,Uid:e5b3670b-e45e-4499-a525-031c765e8a68,Namespace:kube-system,Attempt:0,}" Jan 28 01:21:25.743367 containerd[2557]: time="2026-01-28T01:21:25.742919786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-bg2b5,Uid:4b861133-0274-4274-bab9-748410e42edc,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:21:25.875117 systemd-networkd[2191]: califda158e8851: Link UP Jan 28 01:21:25.875849 systemd-networkd[2191]: califda158e8851: Gained carrier Jan 28 01:21:25.898654 containerd[2557]: 2026-01-28 01:21:25.819 [INFO][5463] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0 coredns-674b8bbfcf- kube-system e5b3670b-e45e-4499-a525-031c765e8a68 827 0 2026-01-28 01:20:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c coredns-674b8bbfcf-p46qn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califda158e8851 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-" Jan 28 01:21:25.898654 containerd[2557]: 2026-01-28 01:21:25.819 [INFO][5463] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.898654 containerd[2557]: 2026-01-28 01:21:25.846 [INFO][5491] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" HandleID="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.846 [INFO][5491] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" HandleID="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"coredns-674b8bbfcf-p46qn", "timestamp":"2026-01-28 01:21:25.846007306 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.846 [INFO][5491] 
ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.846 [INFO][5491] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.846 [INFO][5491] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.851 [INFO][5491] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.854 [INFO][5491] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.857 [INFO][5491] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.858 [INFO][5491] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899109 containerd[2557]: 2026-01-28 01:21:25.859 [INFO][5491] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.859 [INFO][5491] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.860 [INFO][5491] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86 Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.864 [INFO][5491] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5491] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.2/26] block=192.168.98.0/26 handle="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5491] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.2/26] handle="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5491] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
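(The PROCTITLE fields in the audit records above are the audited process's full command line, hex-encoded by auditd with NUL bytes separating the argv entries — here the bpftool, iptables-nft-restore, calico-node and runc invocations. A minimal decoding sketch, assuming plain Python 3 and no extra dependencies; the sample payload is copied verbatim from the entries above:

    def decode_proctitle(hex_str: str) -> str:
        # auditd hex-encodes the command line; argv entries are NUL-separated
        raw = bytes.fromhex(hex_str)
        return " ".join(p.decode("utf-8", "replace") for p in raw.split(b"\x00") if p)

    # Payload taken from one of the audit records above:
    print(decode_proctitle("627066746F6F6C006D6170006C697374002D2D6A736F6E"))
    # prints: bpftool map list --json

Decoded this way, the records above correspond to Calico's calico-node/Felix agent driving bpftool map and prog operations under /sys/fs/bpf/calico and /sys/fs/bpf/tc/globals, plus iptables-nft-restore rule programming.)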
Jan 28 01:21:25.899337 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5491] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.2/26] IPv6=[] ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" HandleID="k8s-pod-network.3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.872 [INFO][5463] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e5b3670b-e45e-4499-a525-031c765e8a68", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"coredns-674b8bbfcf-p46qn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califda158e8851", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.872 [INFO][5463] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.2/32] ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.872 [INFO][5463] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califda158e8851 ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.875 [INFO][5463] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.876 [INFO][5463] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e5b3670b-e45e-4499-a525-031c765e8a68", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86", Pod:"coredns-674b8bbfcf-p46qn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califda158e8851", MAC:"26:af:ae:12:7f:98", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:25.899475 containerd[2557]: 2026-01-28 01:21:25.897 [INFO][5463] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" Namespace="kube-system" Pod="coredns-674b8bbfcf-p46qn" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--p46qn-eth0" Jan 28 01:21:25.908000 audit[5515]: NETFILTER_CFG table=filter:128 family=2 entries=42 op=nft_register_chain pid=5515 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:25.912567 kernel: kauditd_printk_skb: 231 callbacks suppressed Jan 28 01:21:25.912651 kernel: audit: type=1325 audit(1769563285.908:675): table=filter:128 family=2 entries=42 op=nft_register_chain pid=5515 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:25.908000 audit[5515]: SYSCALL arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7fff799cc530 a2=0 a3=7fff799cc51c items=0 ppid=5201 pid=5515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:21:25.918817 kernel: audit: type=1300 audit(1769563285.908:675): arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7fff799cc530 a2=0 a3=7fff799cc51c items=0 ppid=5201 pid=5515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.908000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:25.922772 kernel: audit: type=1327 audit(1769563285.908:675): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:25.932030 containerd[2557]: time="2026-01-28T01:21:25.931577726Z" level=info msg="connecting to shim 3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86" address="unix:///run/containerd/s/a2fbaf728f9d052f87a6dd384f581f59f9a98c34b91be453bbd3b7defe9af7e2" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:25.951166 systemd[1]: Started cri-containerd-3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86.scope - libcontainer container 3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86. Jan 28 01:21:25.962000 audit: BPF prog-id=235 op=LOAD Jan 28 01:21:25.964986 kernel: audit: type=1334 audit(1769563285.962:676): prog-id=235 op=LOAD Jan 28 01:21:25.963000 audit: BPF prog-id=236 op=LOAD Jan 28 01:21:25.963000 audit[5536]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.973002 kernel: audit: type=1334 audit(1769563285.963:677): prog-id=236 op=LOAD Jan 28 01:21:25.973062 kernel: audit: type=1300 audit(1769563285.963:677): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.980378 kernel: audit: type=1327 audit(1769563285.963:677): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.963000 audit: BPF prog-id=236 op=UNLOAD Jan 28 01:21:25.983208 kernel: audit: type=1334 audit(1769563285.963:678): prog-id=236 op=UNLOAD Jan 28 01:21:25.989795 kernel: audit: type=1300 audit(1769563285.963:678): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.963000 audit[5536]: SYSCALL arch=c000003e syscall=3 success=yes 
exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.992003 systemd-networkd[2191]: caliac919477414: Link UP Jan 28 01:21:25.992315 systemd-networkd[2191]: caliac919477414: Gained carrier Jan 28 01:21:25.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.963000 audit: BPF prog-id=237 op=LOAD Jan 28 01:21:25.963000 audit[5536]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.963000 audit: BPF prog-id=238 op=LOAD Jan 28 01:21:25.963000 audit[5536]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.964000 audit: BPF prog-id=238 op=UNLOAD Jan 28 01:21:25.964000 audit[5536]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.964000 audit: BPF prog-id=237 op=UNLOAD Jan 28 01:21:26.000034 kernel: audit: type=1327 audit(1769563285.963:678): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.964000 audit[5536]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.964000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:25.964000 audit: BPF prog-id=239 op=LOAD Jan 28 01:21:25.964000 audit[5536]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=5525 pid=5536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:25.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932393965626532333163313364666531393630313434643261 Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.819 [INFO][5467] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0 calico-apiserver-db6789d8- calico-apiserver 4b861133-0274-4274-bab9-748410e42edc 829 0 2026-01-28 01:20:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:db6789d8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c calico-apiserver-db6789d8-bg2b5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliac919477414 [] [] }} ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.819 [INFO][5467] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.848 [INFO][5492] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" HandleID="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.848 [INFO][5492] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" HandleID="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ae020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593.0.0-n-84a137a86c", "pod":"calico-apiserver-db6789d8-bg2b5", "timestamp":"2026-01-28 01:21:25.848158041 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.848 [INFO][5492] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5492] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.869 [INFO][5492] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.952 [INFO][5492] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.958 [INFO][5492] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.962 [INFO][5492] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.965 [INFO][5492] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.968 [INFO][5492] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.968 [INFO][5492] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.969 [INFO][5492] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.975 [INFO][5492] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.985 [INFO][5492] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.3/26] block=192.168.98.0/26 handle="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.985 [INFO][5492] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.3/26] handle="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.986 [INFO][5492] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
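
The audit records interleaved above arrive in groups: the event itself (NETFILTER_CFG or BPF), the SYSCALL record that produced it, and a PROCTITLE record whose value is the invoking command line, hex-encoded with NUL separators between arguments. A minimal standard-library Python sketch that decodes the PROCTITLE value quoted above:

    # Decode an audit PROCTITLE field: hex -> bytes, then split on NUL bytes.
    import binascii

    def decode_proctitle(hex_string: str) -> str:
        raw = binascii.unhexlify(hex_string)
        return " ".join(arg.decode() for arg in raw.split(b"\x00") if arg)

    # Value copied verbatim from the audit record above:
    proctitle = ("69707461626C65732D6E66742D726573746F7265"
                 "002D2D6E6F666C757368002D2D766572626F7365"
                 "002D2D77616974003130002D2D776169742D696E74657276616C003530303030")
    print(decode_proctitle(proctitle))
    # -> iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000

Decoding the runc PROCTITLE values the same way gives "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/3199299ebe231c13dfe1960144d2a", with the container ID cut short, consistent with the kernel's fixed-size proctitle buffer.
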
Jan 28 01:21:26.009794 containerd[2557]: 2026-01-28 01:21:25.986 [INFO][5492] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.3/26] IPv6=[] ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" HandleID="k8s-pod-network.b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:25.990 [INFO][5467] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0", GenerateName:"calico-apiserver-db6789d8-", Namespace:"calico-apiserver", SelfLink:"", UID:"4b861133-0274-4274-bab9-748410e42edc", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"db6789d8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"calico-apiserver-db6789d8-bg2b5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliac919477414", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:25.990 [INFO][5467] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.3/32] ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:25.990 [INFO][5467] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac919477414 ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:25.992 [INFO][5467] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:25.992 [INFO][5467] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0", GenerateName:"calico-apiserver-db6789d8-", Namespace:"calico-apiserver", SelfLink:"", UID:"4b861133-0274-4274-bab9-748410e42edc", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"db6789d8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c", Pod:"calico-apiserver-db6789d8-bg2b5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliac919477414", MAC:"62:4f:61:70:4a:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:26.010311 containerd[2557]: 2026-01-28 01:21:26.008 [INFO][5467] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-bg2b5" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--bg2b5-eth0" Jan 28 01:21:26.026000 audit[5564]: NETFILTER_CFG table=filter:129 family=2 entries=54 op=nft_register_chain pid=5564 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:26.026000 audit[5564]: SYSCALL arch=c000003e syscall=46 success=yes exit=29396 a0=3 a1=7fff73ff6800 a2=0 a3=7fff73ff67ec items=0 ppid=5201 pid=5564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.026000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:26.035567 containerd[2557]: time="2026-01-28T01:21:26.035535359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-p46qn,Uid:e5b3670b-e45e-4499-a525-031c765e8a68,Namespace:kube-system,Attempt:0,} returns sandbox id \"3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86\"" Jan 28 01:21:26.041390 containerd[2557]: time="2026-01-28T01:21:26.041359084Z" level=info msg="CreateContainer within sandbox \"3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 01:21:26.057712 containerd[2557]: time="2026-01-28T01:21:26.057682263Z" level=info msg="connecting to shim b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c" address="unix:///run/containerd/s/4dcc50b91249ded020c5a48dad1efbaf3c06cb0446dc00d62fc34fb9b90e5af3" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:26.061095 containerd[2557]: time="2026-01-28T01:21:26.061068814Z" level=info msg="Container 58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:26.073111 systemd[1]: Started cri-containerd-b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c.scope - libcontainer container b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c. Jan 28 01:21:26.079000 audit: BPF prog-id=240 op=LOAD Jan 28 01:21:26.079000 audit: BPF prog-id=241 op=LOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=241 op=UNLOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=242 op=LOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=243 op=LOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=243 op=UNLOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=242 op=UNLOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.079000 audit: BPF prog-id=244 op=LOAD Jan 28 01:21:26.079000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=5580 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.079000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234396638353364326638366639623036303961386661326132616261 Jan 28 01:21:26.107670 containerd[2557]: time="2026-01-28T01:21:26.107641153Z" level=info msg="CreateContainer within sandbox \"3199299ebe231c13dfe1960144d2a13a53ccf3ccd73642e92e88d3a41b427a86\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1\"" Jan 28 01:21:26.108990 containerd[2557]: time="2026-01-28T01:21:26.108791081Z" level=info msg="StartContainer for \"58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1\"" Jan 28 01:21:26.110939 containerd[2557]: time="2026-01-28T01:21:26.110596419Z" level=info msg="connecting to shim 58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1" address="unix:///run/containerd/s/a2fbaf728f9d052f87a6dd384f581f59f9a98c34b91be453bbd3b7defe9af7e2" protocol=ttrpc version=3 Jan 28 01:21:26.113804 containerd[2557]: time="2026-01-28T01:21:26.113775744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-bg2b5,Uid:4b861133-0274-4274-bab9-748410e42edc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b49f853d2f86f9b0609a8fa2a2aba3a512b02810481bb29d7e19604869ac996c\"" Jan 28 01:21:26.115971 containerd[2557]: time="2026-01-28T01:21:26.115849945Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:21:26.133103 systemd[1]: Started cri-containerd-58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1.scope - libcontainer container 58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1. Jan 28 01:21:26.140000 audit: BPF prog-id=245 op=LOAD Jan 28 01:21:26.141000 audit: BPF prog-id=246 op=LOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=246 op=UNLOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=247 op=LOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=248 op=LOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=248 op=UNLOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=247 op=UNLOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.141000 audit: BPF prog-id=249 op=LOAD Jan 28 01:21:26.141000 audit[5618]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5525 pid=5618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.141000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3538616235373230363662336164396464353235353430633632646239 Jan 28 01:21:26.156842 containerd[2557]: time="2026-01-28T01:21:26.156779227Z" level=info msg="StartContainer for \"58ab572066b3ad9dd525540c62db9949c929c9edf6f12adfd4b3e65abf182fe1\" returns successfully" Jan 28 01:21:26.380601 containerd[2557]: time="2026-01-28T01:21:26.380474545Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:26.382861 containerd[2557]: time="2026-01-28T01:21:26.382798317Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:21:26.382989 containerd[2557]: time="2026-01-28T01:21:26.382812919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:26.383179 kubelet[4038]: E0128 01:21:26.383150 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:26.383475 kubelet[4038]: E0128 01:21:26.383194 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:26.383719 kubelet[4038]: E0128 01:21:26.383678 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rjlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:26.385059 kubelet[4038]: E0128 01:21:26.384994 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:26.740578 containerd[2557]: time="2026-01-28T01:21:26.740539307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-85rm2,Uid:489aa6ff-974c-4c0f-ad71-b359b70146bf,Namespace:calico-apiserver,Attempt:0,}" Jan 28 01:21:26.823027 systemd-networkd[2191]: caliabfd977ac3d: Link UP Jan 28 01:21:26.824084 systemd-networkd[2191]: caliabfd977ac3d: Gained carrier Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.775 [INFO][5649] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0 calico-apiserver-db6789d8- calico-apiserver 489aa6ff-974c-4c0f-ad71-b359b70146bf 830 0 2026-01-28 01:20:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:db6789d8 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c calico-apiserver-db6789d8-85rm2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliabfd977ac3d [] [] }} ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.775 [INFO][5649] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.794 [INFO][5661] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" HandleID="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.794 [INFO][5661] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" HandleID="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5800), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593.0.0-n-84a137a86c", "pod":"calico-apiserver-db6789d8-85rm2", "timestamp":"2026-01-28 01:21:26.794713722 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.794 [INFO][5661] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.794 [INFO][5661] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.794 [INFO][5661] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.798 [INFO][5661] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.801 [INFO][5661] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.804 [INFO][5661] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.805 [INFO][5661] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.807 [INFO][5661] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.807 [INFO][5661] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.808 [INFO][5661] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.811 [INFO][5661] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.817 [INFO][5661] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.4/26] block=192.168.98.0/26 handle="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.817 [INFO][5661] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.4/26] handle="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.817 [INFO][5661] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
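
The ipam/ipam.go entries above trace how Calico assigns an address to the new endpoint: take the host-wide IPAM lock, look up this node's block affinity, load the affine block 192.168.98.0/26, claim the next free address (192.168.98.4/26 here), write the block back to record the claim, and release the lock. A small standard-library Python sketch that pulls the claimed addresses out of log lines like these (the sample line is abridged from the output above):

    # Extract addresses reported by "Successfully claimed IPs: [...]" entries.
    import re

    CLAIM_RE = re.compile(r"Successfully claimed IPs: \[([^\]]+)\]")

    def claimed_ips(log_lines):
        for line in log_lines:
            m = CLAIM_RE.search(line)
            if m:
                yield from (ip.strip() for ip in m.group(1).split(","))

    sample = ['ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.4/26] block=192.168.98.0/26']
    print(list(claimed_ips(sample)))   # -> ['192.168.98.4/26']
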
Jan 28 01:21:26.842114 containerd[2557]: 2026-01-28 01:21:26.817 [INFO][5661] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.4/26] IPv6=[] ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" HandleID="k8s-pod-network.fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.819 [INFO][5649] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0", GenerateName:"calico-apiserver-db6789d8-", Namespace:"calico-apiserver", SelfLink:"", UID:"489aa6ff-974c-4c0f-ad71-b359b70146bf", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"db6789d8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"calico-apiserver-db6789d8-85rm2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliabfd977ac3d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.820 [INFO][5649] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.4/32] ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.820 [INFO][5649] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliabfd977ac3d ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.824 [INFO][5649] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.824 [INFO][5649] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0", GenerateName:"calico-apiserver-db6789d8-", Namespace:"calico-apiserver", SelfLink:"", UID:"489aa6ff-974c-4c0f-ad71-b359b70146bf", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"db6789d8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f", Pod:"calico-apiserver-db6789d8-85rm2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliabfd977ac3d", MAC:"9e:12:6d:97:95:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:26.843545 containerd[2557]: 2026-01-28 01:21:26.840 [INFO][5649] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" Namespace="calico-apiserver" Pod="calico-apiserver-db6789d8-85rm2" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--apiserver--db6789d8--85rm2-eth0" Jan 28 01:21:26.851000 audit[5676]: NETFILTER_CFG table=filter:130 family=2 entries=51 op=nft_register_chain pid=5676 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:26.851000 audit[5676]: SYSCALL arch=c000003e syscall=46 success=yes exit=27116 a0=3 a1=7fffc80aea70 a2=0 a3=7fffc80aea5c items=0 ppid=5201 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.851000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:26.897357 containerd[2557]: time="2026-01-28T01:21:26.897029682Z" level=info msg="connecting to shim fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f" address="unix:///run/containerd/s/d2891aa0c0455f1e9d864fe150bfa7ce87966d026f1aebda53921f30447e169a" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:26.922223 systemd[1]: Started cri-containerd-fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f.scope - libcontainer container fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f. 
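
Two side effects of each sandbox start show up in the surrounding audit stream. The NETFILTER_CFG records (op=nft_register_chain, comm="iptables-nft-re") are Calico registering per-endpoint chains; the SYSCALL records attached to them use syscall 46, which is sendmsg on x86_64, i.e. the netlink message carrying the ruleset. The BPF prog LOAD/UNLOAD pairs come from runc (syscall 321 is bpf, 3 is close) and most likely correspond to the cgroup-v2 device-controller programs runc attaches when it creates the container; that interpretation is an assumption, not stated in the log. A small helper with a hand-maintained syscall subset (only the numbers seen here) to summarize such SYSCALL records:

    # Summarize an audit SYSCALL record: which binary made which syscall.
    import re

    SYSCALL_NAMES = {3: "close", 46: "sendmsg", 321: "bpf"}   # x86_64 subset seen in this log

    def summarize(audit_line: str) -> str:
        nr = int(re.search(r"syscall=(\d+)", audit_line).group(1))
        comm = re.search(r'comm="([^"]+)"', audit_line).group(1)
        return f'{comm}: {SYSCALL_NAMES.get(nr, nr)}'

    line = 'audit: SYSCALL arch=c000003e syscall=321 success=yes comm="runc" exe="/usr/bin/runc"'
    print(summarize(line))   # -> runc: bpf
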
Jan 28 01:21:26.932000 audit: BPF prog-id=250 op=LOAD Jan 28 01:21:26.932000 audit: BPF prog-id=251 op=LOAD Jan 28 01:21:26.932000 audit[5697]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.932000 audit: BPF prog-id=251 op=UNLOAD Jan 28 01:21:26.932000 audit[5697]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.933000 audit: BPF prog-id=252 op=LOAD Jan 28 01:21:26.933000 audit[5697]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.933000 audit: BPF prog-id=253 op=LOAD Jan 28 01:21:26.933000 audit[5697]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.933000 audit: BPF prog-id=253 op=UNLOAD Jan 28 01:21:26.933000 audit[5697]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.933000 audit: BPF prog-id=252 op=UNLOAD Jan 28 01:21:26.933000 audit[5697]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.933000 audit: BPF prog-id=254 op=LOAD Jan 28 01:21:26.933000 audit[5697]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=5685 pid=5697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:26.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6664643933306431396232306235353237383665363562316634393564 Jan 28 01:21:26.958326 kubelet[4038]: E0128 01:21:26.958299 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:26.981824 containerd[2557]: time="2026-01-28T01:21:26.981738805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-db6789d8-85rm2,Uid:489aa6ff-974c-4c0f-ad71-b359b70146bf,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fdd930d19b20b552786e65b1f495db75d469202c4167cdbeb3fe626d00bde18f\"" Jan 28 01:21:26.986968 containerd[2557]: time="2026-01-28T01:21:26.986127730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:21:27.003000 audit[5723]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=5723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:27.003000 audit[5723]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc6f26df70 a2=0 a3=7ffc6f26df5c items=0 ppid=4145 pid=5723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.003000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:27.008000 audit[5723]: NETFILTER_CFG table=nat:132 family=2 entries=14 op=nft_register_rule pid=5723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:27.008000 audit[5723]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc6f26df70 a2=0 a3=0 items=0 ppid=4145 pid=5723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
01:21:27.008000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:27.021000 audit[5725]: NETFILTER_CFG table=filter:133 family=2 entries=17 op=nft_register_rule pid=5725 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:27.021000 audit[5725]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff8ef030c0 a2=0 a3=7fff8ef030ac items=0 ppid=4145 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.021000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:27.023000 audit[5725]: NETFILTER_CFG table=nat:134 family=2 entries=35 op=nft_register_chain pid=5725 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:27.023000 audit[5725]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7fff8ef030c0 a2=0 a3=7fff8ef030ac items=0 ppid=4145 pid=5725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.023000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:27.119150 systemd-networkd[2191]: caliac919477414: Gained IPv6LL Jan 28 01:21:27.247687 containerd[2557]: time="2026-01-28T01:21:27.247652449Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:27.250510 containerd[2557]: time="2026-01-28T01:21:27.250484990Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:21:27.250635 containerd[2557]: time="2026-01-28T01:21:27.250556606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:27.250700 kubelet[4038]: E0128 01:21:27.250662 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:27.250743 kubelet[4038]: E0128 01:21:27.250713 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:27.250852 kubelet[4038]: E0128 01:21:27.250823 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dkxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:27.252176 kubelet[4038]: E0128 01:21:27.252148 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:27.503315 systemd-networkd[2191]: califda158e8851: Gained IPv6LL Jan 28 01:21:27.742108 containerd[2557]: time="2026-01-28T01:21:27.741756649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5496f89df7-4vb68,Uid:8b2b158a-081b-4454-a96f-65445d9cadc6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:27.742108 containerd[2557]: time="2026-01-28T01:21:27.741868167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mb789,Uid:d30a5acc-464e-4af2-9051-87ebe5ea8e81,Namespace:kube-system,Attempt:0,}" Jan 28 01:21:27.861447 systemd-networkd[2191]: cali66231353d7c: Link UP Jan 28 01:21:27.862207 systemd-networkd[2191]: cali66231353d7c: Gained carrier Jan 28 01:21:27.876777 kubelet[4038]: I0128 01:21:27.876154 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/coredns-674b8bbfcf-p46qn" podStartSLOduration=41.876138243 podStartE2EDuration="41.876138243s" podCreationTimestamp="2026-01-28 01:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:21:26.99899325 +0000 UTC m=+47.342611912" watchObservedRunningTime="2026-01-28 01:21:27.876138243 +0000 UTC m=+48.219756904" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.792 [INFO][5732] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0 coredns-674b8bbfcf- kube-system d30a5acc-464e-4af2-9051-87ebe5ea8e81 831 0 2026-01-28 01:20:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c coredns-674b8bbfcf-mb789 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali66231353d7c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.793 [INFO][5732] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.824 [INFO][5758] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" HandleID="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.824 [INFO][5758] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" HandleID="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5030), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"coredns-674b8bbfcf-mb789", "timestamp":"2026-01-28 01:21:27.824091954 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.824 [INFO][5758] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.824 [INFO][5758] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.824 [INFO][5758] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.830 [INFO][5758] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.832 [INFO][5758] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.835 [INFO][5758] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.836 [INFO][5758] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.842 [INFO][5758] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.842 [INFO][5758] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.843 [INFO][5758] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.849 [INFO][5758] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5758] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.5/26] block=192.168.98.0/26 handle="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5758] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.5/26] handle="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5758] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 01:21:27.878396 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5758] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.5/26] IPv6=[] ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" HandleID="k8s-pod-network.9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Workload="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.858 [INFO][5732] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d30a5acc-464e-4af2-9051-87ebe5ea8e81", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"coredns-674b8bbfcf-mb789", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66231353d7c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.858 [INFO][5732] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.5/32] ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.858 [INFO][5732] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66231353d7c ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.860 [INFO][5732] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.861 [INFO][5732] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d30a5acc-464e-4af2-9051-87ebe5ea8e81", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c", Pod:"coredns-674b8bbfcf-mb789", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66231353d7c", MAC:"82:93:f7:41:2c:a3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:27.879338 containerd[2557]: 2026-01-28 01:21:27.876 [INFO][5732] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" Namespace="kube-system" Pod="coredns-674b8bbfcf-mb789" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-coredns--674b8bbfcf--mb789-eth0" Jan 28 01:21:27.889000 audit[5782]: NETFILTER_CFG table=filter:135 family=2 entries=40 op=nft_register_chain pid=5782 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:27.889000 audit[5782]: SYSCALL arch=c000003e syscall=46 success=yes exit=20328 a0=3 a1=7ffc72e13010 a2=0 a3=7ffc72e12ffc items=0 ppid=5201 pid=5782 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.889000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:27.913104 containerd[2557]: time="2026-01-28T01:21:27.913058488Z" level=info 
msg="connecting to shim 9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c" address="unix:///run/containerd/s/494f0d108353b8e45f00500415b282f8762b9e059068738696c0df5f49325058" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:27.934184 systemd[1]: Started cri-containerd-9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c.scope - libcontainer container 9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c. Jan 28 01:21:27.946000 audit: BPF prog-id=255 op=LOAD Jan 28 01:21:27.947000 audit: BPF prog-id=256 op=LOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=256 op=UNLOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=257 op=LOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=258 op=LOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=258 op=UNLOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=257 op=UNLOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.947000 audit: BPF prog-id=259 op=LOAD Jan 28 01:21:27.947000 audit[5803]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=5792 pid=5803 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:27.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965333061353162343335633630363439386436363663363537653434 Jan 28 01:21:27.966480 kubelet[4038]: E0128 01:21:27.966447 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:27.966812 kubelet[4038]: E0128 01:21:27.966716 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:27.984544 systemd-networkd[2191]: cali6f8510443f1: Link UP Jan 28 01:21:27.985799 systemd-networkd[2191]: cali6f8510443f1: Gained carrier Jan 28 01:21:28.005555 containerd[2557]: time="2026-01-28T01:21:28.005510548Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mb789,Uid:d30a5acc-464e-4af2-9051-87ebe5ea8e81,Namespace:kube-system,Attempt:0,} returns sandbox id \"9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c\"" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.800 [INFO][5736] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0 calico-kube-controllers-5496f89df7- calico-system 8b2b158a-081b-4454-a96f-65445d9cadc6 828 0 2026-01-28 01:21:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5496f89df7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c calico-kube-controllers-5496f89df7-4vb68 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6f8510443f1 [] [] }} ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.801 [INFO][5736] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.830 [INFO][5763] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" HandleID="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.830 [INFO][5763] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" HandleID="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5840), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"calico-kube-controllers-5496f89df7-4vb68", "timestamp":"2026-01-28 01:21:27.830141042 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.830 [INFO][5763] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5763] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.855 [INFO][5763] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.931 [INFO][5763] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.935 [INFO][5763] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.938 [INFO][5763] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.940 [INFO][5763] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.942 [INFO][5763] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.942 [INFO][5763] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.943 [INFO][5763] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6 Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.949 [INFO][5763] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.957 [INFO][5763] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.6/26] block=192.168.98.0/26 handle="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.957 [INFO][5763] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.6/26] handle="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.957 [INFO][5763] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 01:21:28.012203 containerd[2557]: 2026-01-28 01:21:27.957 [INFO][5763] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.6/26] IPv6=[] ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" HandleID="k8s-pod-network.60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Workload="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:27.962 [INFO][5736] cni-plugin/k8s.go 418: Populated endpoint ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0", GenerateName:"calico-kube-controllers-5496f89df7-", Namespace:"calico-system", SelfLink:"", UID:"8b2b158a-081b-4454-a96f-65445d9cadc6", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5496f89df7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"calico-kube-controllers-5496f89df7-4vb68", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6f8510443f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:27.963 [INFO][5736] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.6/32] ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:27.963 [INFO][5736] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f8510443f1 ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:27.990 [INFO][5736] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" 
WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:27.991 [INFO][5736] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0", GenerateName:"calico-kube-controllers-5496f89df7-", Namespace:"calico-system", SelfLink:"", UID:"8b2b158a-081b-4454-a96f-65445d9cadc6", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5496f89df7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6", Pod:"calico-kube-controllers-5496f89df7-4vb68", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6f8510443f1", MAC:"62:a2:d0:01:73:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:28.013653 containerd[2557]: 2026-01-28 01:21:28.009 [INFO][5736] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" Namespace="calico-system" Pod="calico-kube-controllers-5496f89df7-4vb68" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-calico--kube--controllers--5496f89df7--4vb68-eth0" Jan 28 01:21:28.015813 containerd[2557]: time="2026-01-28T01:21:28.015785295Z" level=info msg="CreateContainer within sandbox \"9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 01:21:28.035000 audit[5839]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=5839 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:28.035000 audit[5839]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffd87cc830 a2=0 a3=7fffd87cc81c items=0 ppid=4145 pid=5839 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.035000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:28.042898 containerd[2557]: 
time="2026-01-28T01:21:28.042866400Z" level=info msg="Container 96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08: CDI devices from CRI Config.CDIDevices: []" Jan 28 01:21:28.043000 audit[5839]: NETFILTER_CFG table=nat:137 family=2 entries=20 op=nft_register_rule pid=5839 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:28.043000 audit[5839]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fffd87cc830 a2=0 a3=7fffd87cc81c items=0 ppid=4145 pid=5839 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.043000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:28.063000 audit[5840]: NETFILTER_CFG table=filter:138 family=2 entries=54 op=nft_register_chain pid=5840 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:28.063000 audit[5840]: SYSCALL arch=c000003e syscall=46 success=yes exit=25976 a0=3 a1=7ffd08eedb80 a2=0 a3=7ffd08eedb6c items=0 ppid=5201 pid=5840 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.063000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:28.072484 containerd[2557]: time="2026-01-28T01:21:28.072447438Z" level=info msg="CreateContainer within sandbox \"9e30a51b435c606498d666c657e44e7d8a5d3a61a03480b8d7aa4e2f9791ad7c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08\"" Jan 28 01:21:28.074413 containerd[2557]: time="2026-01-28T01:21:28.074379456Z" level=info msg="StartContainer for \"96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08\"" Jan 28 01:21:28.077121 containerd[2557]: time="2026-01-28T01:21:28.077092893Z" level=info msg="connecting to shim 60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6" address="unix:///run/containerd/s/19f314be4fbf6cc2f48ff3e4a7ba1402b134895b4b371e15660a951e5fa019fc" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:28.078284 containerd[2557]: time="2026-01-28T01:21:28.078082064Z" level=info msg="connecting to shim 96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08" address="unix:///run/containerd/s/494f0d108353b8e45f00500415b282f8762b9e059068738696c0df5f49325058" protocol=ttrpc version=3 Jan 28 01:21:28.100262 systemd[1]: Started cri-containerd-96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08.scope - libcontainer container 96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08. Jan 28 01:21:28.109121 systemd[1]: Started cri-containerd-60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6.scope - libcontainer container 60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6. 
Jan 28 01:21:28.115000 audit: BPF prog-id=260 op=LOAD Jan 28 01:21:28.117000 audit: BPF prog-id=261 op=LOAD Jan 28 01:21:28.117000 audit[5860]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.117000 audit: BPF prog-id=261 op=UNLOAD Jan 28 01:21:28.117000 audit[5860]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.118000 audit: BPF prog-id=262 op=LOAD Jan 28 01:21:28.118000 audit[5860]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.118000 audit: BPF prog-id=263 op=LOAD Jan 28 01:21:28.118000 audit[5860]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.118000 audit: BPF prog-id=263 op=UNLOAD Jan 28 01:21:28.118000 audit[5860]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.118000 audit: BPF prog-id=262 op=UNLOAD Jan 28 01:21:28.118000 audit[5860]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.118000 audit: BPF prog-id=264 op=LOAD Jan 28 01:21:28.118000 audit[5860]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=5792 pid=5860 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936363132323635626130346437633764383938306539333939616431 Jan 28 01:21:28.133000 audit: BPF prog-id=265 op=LOAD Jan 28 01:21:28.133000 audit: BPF prog-id=266 op=LOAD Jan 28 01:21:28.133000 audit[5862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.133000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=266 op=UNLOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=267 op=LOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=268 op=LOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=5848 pid=5862 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=268 op=UNLOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=267 op=UNLOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.134000 audit: BPF prog-id=269 op=LOAD Jan 28 01:21:28.134000 audit[5862]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=5848 pid=5862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.134000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630393831616663656365363638313662383531623463373835613831 Jan 28 01:21:28.144205 containerd[2557]: time="2026-01-28T01:21:28.144167017Z" level=info msg="StartContainer for \"96612265ba04d7c7d8980e9399ad1571ce29f47800a6ac0f0581d4382679ba08\" returns successfully" Jan 28 01:21:28.183864 containerd[2557]: time="2026-01-28T01:21:28.183840597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5496f89df7-4vb68,Uid:8b2b158a-081b-4454-a96f-65445d9cadc6,Namespace:calico-system,Attempt:0,} returns sandbox id \"60981afcece66816b851b4c785a81415379ea73749bd3fe316d190f7c7e58cb6\"" Jan 28 01:21:28.185236 containerd[2557]: time="2026-01-28T01:21:28.185182045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:21:28.207079 systemd-networkd[2191]: caliabfd977ac3d: Gained IPv6LL Jan 28 01:21:28.428893 containerd[2557]: time="2026-01-28T01:21:28.428845046Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:28.431348 containerd[2557]: time="2026-01-28T01:21:28.431312858Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:21:28.431412 containerd[2557]: time="2026-01-28T01:21:28.431387396Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:28.431584 kubelet[4038]: E0128 01:21:28.431552 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:21:28.431635 kubelet[4038]: E0128 01:21:28.431599 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:21:28.431770 kubelet[4038]: E0128 01:21:28.431731 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wb2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:28.433162 kubelet[4038]: E0128 01:21:28.433127 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:21:28.741246 containerd[2557]: time="2026-01-28T01:21:28.741115289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-kz629,Uid:d68b64e3-e019-4732-8971-c8457279d8f6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:28.825126 systemd-networkd[2191]: cali0c6b7becdb0: Link UP Jan 28 01:21:28.826282 systemd-networkd[2191]: cali0c6b7becdb0: Gained carrier Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.772 [INFO][5919] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0 goldmane-666569f655- calico-system d68b64e3-e019-4732-8971-c8457279d8f6 832 0 2026-01-28 01:20:59 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c goldmane-666569f655-kz629 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0c6b7becdb0 [] [] }} ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.772 [INFO][5919] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.791 [INFO][5931] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" HandleID="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Workload="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.791 [INFO][5931] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" HandleID="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Workload="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f070), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"goldmane-666569f655-kz629", "timestamp":"2026-01-28 01:21:28.791210378 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.791 [INFO][5931] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.791 [INFO][5931] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.791 [INFO][5931] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.797 [INFO][5931] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.800 [INFO][5931] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.803 [INFO][5931] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.804 [INFO][5931] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.808 [INFO][5931] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.808 [INFO][5931] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.809 [INFO][5931] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438 Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.814 [INFO][5931] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.821 [INFO][5931] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.7/26] block=192.168.98.0/26 handle="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" 
host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.821 [INFO][5931] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.7/26] handle="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.821 [INFO][5931] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 01:21:28.839269 containerd[2557]: 2026-01-28 01:21:28.821 [INFO][5931] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.7/26] IPv6=[] ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" HandleID="k8s-pod-network.0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Workload="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.822 [INFO][5919] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"d68b64e3-e019-4732-8971-c8457279d8f6", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"goldmane-666569f655-kz629", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0c6b7becdb0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.822 [INFO][5919] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.7/32] ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.822 [INFO][5919] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c6b7becdb0 ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.826 [INFO][5919] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.827 [INFO][5919] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"d68b64e3-e019-4732-8971-c8457279d8f6", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 20, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438", Pod:"goldmane-666569f655-kz629", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0c6b7becdb0", MAC:"fe:87:2e:cc:f8:64", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:28.840377 containerd[2557]: 2026-01-28 01:21:28.835 [INFO][5919] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" Namespace="calico-system" Pod="goldmane-666569f655-kz629" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-goldmane--666569f655--kz629-eth0" Jan 28 01:21:28.852000 audit[5947]: NETFILTER_CFG table=filter:139 family=2 entries=56 op=nft_register_chain pid=5947 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:28.852000 audit[5947]: SYSCALL arch=c000003e syscall=46 success=yes exit=28712 a0=3 a1=7ffde4f46cf0 a2=0 a3=7ffde4f46cdc items=0 ppid=5201 pid=5947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.852000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:28.876194 containerd[2557]: time="2026-01-28T01:21:28.876101775Z" level=info msg="connecting to shim 0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438" address="unix:///run/containerd/s/c0130cb29c6eb471753c0783759a70cf3bf4228f8bb365fbd712d39e43428050" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:28.894170 
systemd[1]: Started cri-containerd-0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438.scope - libcontainer container 0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438. Jan 28 01:21:28.905000 audit: BPF prog-id=270 op=LOAD Jan 28 01:21:28.907000 audit: BPF prog-id=271 op=LOAD Jan 28 01:21:28.907000 audit[5967]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=271 op=UNLOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=272 op=LOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=273 op=LOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=273 op=UNLOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=272 op=UNLOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.908000 audit: BPF prog-id=274 op=LOAD Jan 28 01:21:28.908000 audit[5967]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=5956 pid=5967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:28.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061316339356265373964383964633962383436643166663234316534 Jan 28 01:21:28.945286 containerd[2557]: time="2026-01-28T01:21:28.945258613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-kz629,Uid:d68b64e3-e019-4732-8971-c8457279d8f6,Namespace:calico-system,Attempt:0,} returns sandbox id \"0a1c95be79d89dc9b846d1ff241e48ba4e007dbcc88180bd93d1e2706758a438\"" Jan 28 01:21:28.946624 containerd[2557]: time="2026-01-28T01:21:28.946474136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:21:28.972683 kubelet[4038]: E0128 01:21:28.972453 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:21:28.980584 kubelet[4038]: E0128 01:21:28.980562 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:28.995274 kubelet[4038]: I0128 01:21:28.995189 4038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-mb789" podStartSLOduration=42.995174723 podStartE2EDuration="42.995174723s" 
podCreationTimestamp="2026-01-28 01:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 01:21:28.995003655 +0000 UTC m=+49.338622316" watchObservedRunningTime="2026-01-28 01:21:28.995174723 +0000 UTC m=+49.338793386" Jan 28 01:21:29.014000 audit[5995]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=5995 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:29.014000 audit[5995]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc0f641cf0 a2=0 a3=7ffc0f641cdc items=0 ppid=4145 pid=5995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.014000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:29.034000 audit[5995]: NETFILTER_CFG table=nat:141 family=2 entries=56 op=nft_register_chain pid=5995 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:29.034000 audit[5995]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffc0f641cf0 a2=0 a3=7ffc0f641cdc items=0 ppid=4145 pid=5995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.034000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:29.039068 systemd-networkd[2191]: cali6f8510443f1: Gained IPv6LL Jan 28 01:21:29.193631 containerd[2557]: time="2026-01-28T01:21:29.193499927Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:29.196011 containerd[2557]: time="2026-01-28T01:21:29.195973597Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:21:29.196144 containerd[2557]: time="2026-01-28T01:21:29.195971276Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:29.196332 kubelet[4038]: E0128 01:21:29.196304 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:21:29.196387 kubelet[4038]: E0128 01:21:29.196344 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:21:29.196635 kubelet[4038]: E0128 01:21:29.196480 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:29.198439 kubelet[4038]: E0128 01:21:29.198412 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:21:29.487251 systemd-networkd[2191]: cali66231353d7c: Gained IPv6LL Jan 28 01:21:29.742834 containerd[2557]: 
time="2026-01-28T01:21:29.742686662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lcd4c,Uid:74fe3431-17ca-4902-9eb5-64c3701d8bd6,Namespace:calico-system,Attempt:0,}" Jan 28 01:21:29.834826 systemd-networkd[2191]: cali57b3c032fdd: Link UP Jan 28 01:21:29.835907 systemd-networkd[2191]: cali57b3c032fdd: Gained carrier Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.777 [INFO][5998] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0 csi-node-driver- calico-system 74fe3431-17ca-4902-9eb5-64c3701d8bd6 707 0 2026-01-28 01:21:01 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593.0.0-n-84a137a86c csi-node-driver-lcd4c eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali57b3c032fdd [] [] }} ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.777 [INFO][5998] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.800 [INFO][6010] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" HandleID="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Workload="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.800 [INFO][6010] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" HandleID="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Workload="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f220), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593.0.0-n-84a137a86c", "pod":"csi-node-driver-lcd4c", "timestamp":"2026-01-28 01:21:29.800214731 +0000 UTC"}, Hostname:"ci-4593.0.0-n-84a137a86c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.800 [INFO][6010] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.800 [INFO][6010] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.800 [INFO][6010] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593.0.0-n-84a137a86c' Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.805 [INFO][6010] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.808 [INFO][6010] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.812 [INFO][6010] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.814 [INFO][6010] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.816 [INFO][6010] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.816 [INFO][6010] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.818 [INFO][6010] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4 Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.822 [INFO][6010] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.829 [INFO][6010] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.8/26] block=192.168.98.0/26 handle="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.829 [INFO][6010] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.8/26] handle="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" host="ci-4593.0.0-n-84a137a86c" Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.829 [INFO][6010] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 01:21:29.849149 containerd[2557]: 2026-01-28 01:21:29.829 [INFO][6010] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.8/26] IPv6=[] ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" HandleID="k8s-pod-network.5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Workload="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.831 [INFO][5998] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"74fe3431-17ca-4902-9eb5-64c3701d8bd6", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"", Pod:"csi-node-driver-lcd4c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali57b3c032fdd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.831 [INFO][5998] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.8/32] ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.831 [INFO][5998] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali57b3c032fdd ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.835 [INFO][5998] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.836 [INFO][5998] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"74fe3431-17ca-4902-9eb5-64c3701d8bd6", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 1, 21, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593.0.0-n-84a137a86c", ContainerID:"5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4", Pod:"csi-node-driver-lcd4c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali57b3c032fdd", MAC:"82:9c:db:5e:fe:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 01:21:29.850933 containerd[2557]: 2026-01-28 01:21:29.846 [INFO][5998] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" Namespace="calico-system" Pod="csi-node-driver-lcd4c" WorkloadEndpoint="ci--4593.0.0--n--84a137a86c-k8s-csi--node--driver--lcd4c-eth0" Jan 28 01:21:29.864000 audit[6024]: NETFILTER_CFG table=filter:142 family=2 entries=52 op=nft_register_chain pid=6024 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 01:21:29.864000 audit[6024]: SYSCALL arch=c000003e syscall=46 success=yes exit=24296 a0=3 a1=7ffdb0e15ed0 a2=0 a3=7ffdb0e15ebc items=0 ppid=5201 pid=6024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.864000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 01:21:29.900021 containerd[2557]: time="2026-01-28T01:21:29.899633366Z" level=info msg="connecting to shim 5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4" address="unix:///run/containerd/s/bad904c2caf67309f8e7293823bbbf5470e0c17455ddaad2c05556436bb9a9cd" namespace=k8s.io protocol=ttrpc version=3 Jan 28 01:21:29.929110 systemd[1]: Started cri-containerd-5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4.scope - libcontainer container 5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4. 
Jan 28 01:21:29.936000 audit: BPF prog-id=275 op=LOAD Jan 28 01:21:29.936000 audit: BPF prog-id=276 op=LOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=276 op=UNLOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=277 op=LOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=278 op=LOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=278 op=UNLOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=277 op=UNLOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.936000 audit: BPF prog-id=279 op=LOAD Jan 28 01:21:29.936000 audit[6043]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=6032 pid=6043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:29.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537333061336338313766613931383563333330313031346333616362 Jan 28 01:21:29.953906 containerd[2557]: time="2026-01-28T01:21:29.953877289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lcd4c,Uid:74fe3431-17ca-4902-9eb5-64c3701d8bd6,Namespace:calico-system,Attempt:0,} returns sandbox id \"5730a3c817fa9185c3301014c3acbc5108a8cf7d8af4a75d460eff6b735687c4\"" Jan 28 01:21:29.955371 containerd[2557]: time="2026-01-28T01:21:29.955347558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:21:29.982447 kubelet[4038]: E0128 01:21:29.982420 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:21:29.983400 kubelet[4038]: E0128 01:21:29.982803 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:21:30.012000 audit[6072]: NETFILTER_CFG table=filter:143 family=2 entries=14 op=nft_register_rule pid=6072 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:30.012000 audit[6072]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe7ff2ca20 a2=0 a3=7ffe7ff2ca0c items=0 ppid=4145 pid=6072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:30.012000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:30.019000 audit[6072]: NETFILTER_CFG table=nat:144 family=2 entries=20 op=nft_register_rule pid=6072 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:21:30.019000 audit[6072]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe7ff2ca20 a2=0 a3=7ffe7ff2ca0c items=0 ppid=4145 pid=6072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:21:30.019000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:21:30.205523 containerd[2557]: time="2026-01-28T01:21:30.205449105Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:30.208248 containerd[2557]: time="2026-01-28T01:21:30.208139694Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:21:30.208248 containerd[2557]: time="2026-01-28T01:21:30.208224654Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:30.209182 kubelet[4038]: E0128 01:21:30.208465 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:21:30.209322 kubelet[4038]: E0128 01:21:30.209292 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:21:30.209683 kubelet[4038]: E0128 01:21:30.209647 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:30.212219 containerd[2557]: time="2026-01-28T01:21:30.212187496Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:21:30.319115 systemd-networkd[2191]: cali0c6b7becdb0: Gained IPv6LL Jan 28 01:21:30.443728 containerd[2557]: time="2026-01-28T01:21:30.443607500Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:30.446438 containerd[2557]: time="2026-01-28T01:21:30.446398980Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:21:30.446515 containerd[2557]: time="2026-01-28T01:21:30.446491669Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:30.446711 kubelet[4038]: E0128 01:21:30.446675 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:21:30.446765 kubelet[4038]: E0128 01:21:30.446744 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:21:30.446923 kubelet[4038]: E0128 01:21:30.446885 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:30.448859 kubelet[4038]: E0128 01:21:30.448825 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:30.983736 kubelet[4038]: E0128 01:21:30.983634 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:31.343105 systemd-networkd[2191]: cali57b3c032fdd: Gained IPv6LL Jan 28 01:21:36.742241 containerd[2557]: time="2026-01-28T01:21:36.742194555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:21:36.993805 containerd[2557]: time="2026-01-28T01:21:36.993674891Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:36.996775 containerd[2557]: time="2026-01-28T01:21:36.996708219Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:21:36.997110 containerd[2557]: time="2026-01-28T01:21:36.996764016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:36.997274 kubelet[4038]: E0128 01:21:36.997177 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:21:36.997860 kubelet[4038]: E0128 01:21:36.997322 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:21:36.997860 kubelet[4038]: E0128 01:21:36.997627 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0c0e29983b554c87af1b31cc149295e5,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:36.999679 containerd[2557]: time="2026-01-28T01:21:36.999621665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:21:37.245335 containerd[2557]: time="2026-01-28T01:21:37.245018151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:37.247700 containerd[2557]: time="2026-01-28T01:21:37.247673975Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:21:37.247782 containerd[2557]: time="2026-01-28T01:21:37.247726125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:37.247858 kubelet[4038]: E0128 01:21:37.247825 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:21:37.247910 kubelet[4038]: E0128 01:21:37.247870 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:21:37.248095 kubelet[4038]: E0128 01:21:37.248032 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:37.249597 kubelet[4038]: E0128 01:21:37.249554 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:21:41.741111 containerd[2557]: time="2026-01-28T01:21:41.741069680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:21:42.001420 containerd[2557]: time="2026-01-28T01:21:42.001296185Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:42.004090 containerd[2557]: time="2026-01-28T01:21:42.004035236Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 
28 01:21:42.004716 containerd[2557]: time="2026-01-28T01:21:42.004112177Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:42.004759 kubelet[4038]: E0128 01:21:42.004190 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:42.004759 kubelet[4038]: E0128 01:21:42.004227 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:42.004759 kubelet[4038]: E0128 01:21:42.004353 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dkxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:42.006415 kubelet[4038]: E0128 01:21:42.006353 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:42.742326 containerd[2557]: time="2026-01-28T01:21:42.742245187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:21:43.004392 containerd[2557]: time="2026-01-28T01:21:43.004267271Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:43.006865 containerd[2557]: time="2026-01-28T01:21:43.006823911Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:21:43.006988 containerd[2557]: time="2026-01-28T01:21:43.006885741Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:43.007015 kubelet[4038]: E0128 01:21:43.006983 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:21:43.007308 kubelet[4038]: E0128 01:21:43.007017 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:21:43.008678 kubelet[4038]: E0128 01:21:43.008634 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:43.010849 containerd[2557]: time="2026-01-28T01:21:43.010665457Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:21:43.279640 containerd[2557]: time="2026-01-28T01:21:43.279532487Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:43.282034 containerd[2557]: time="2026-01-28T01:21:43.281987129Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:21:43.282154 containerd[2557]: time="2026-01-28T01:21:43.282066069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:43.282223 kubelet[4038]: E0128 01:21:43.282183 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:21:43.282269 kubelet[4038]: E0128 01:21:43.282238 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:21:43.282664 kubelet[4038]: E0128 01:21:43.282365 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:43.284323 kubelet[4038]: E0128 01:21:43.283971 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:43.742348 containerd[2557]: time="2026-01-28T01:21:43.742124514Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:21:43.993070 containerd[2557]: time="2026-01-28T01:21:43.992936828Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:21:43.995792 containerd[2557]: time="2026-01-28T01:21:43.995673478Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:21:43.995792 containerd[2557]: time="2026-01-28T01:21:43.995765473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:43.996118 kubelet[4038]: E0128 01:21:43.996032 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:43.996118 kubelet[4038]: E0128 01:21:43.996097 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:21:43.997176 kubelet[4038]: E0128 01:21:43.996609 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rjlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:43.997334 containerd[2557]: time="2026-01-28T01:21:43.996448499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:21:43.997761 kubelet[4038]: E0128 01:21:43.997734 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:44.245370 containerd[2557]: time="2026-01-28T01:21:44.245260939Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:44.251386 containerd[2557]: time="2026-01-28T01:21:44.251357395Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:21:44.251461 containerd[2557]: time="2026-01-28T01:21:44.251433530Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:44.251591 kubelet[4038]: E0128 01:21:44.251558 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:21:44.251839 kubelet[4038]: E0128 01:21:44.251603 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:21:44.251880 kubelet[4038]: E0128 01:21:44.251828 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wb2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:44.252601 containerd[2557]: time="2026-01-28T01:21:44.252576960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:21:44.253943 kubelet[4038]: E0128 01:21:44.253900 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:21:44.490580 containerd[2557]: 
time="2026-01-28T01:21:44.490533127Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:21:44.493137 containerd[2557]: time="2026-01-28T01:21:44.493107793Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:21:44.493198 containerd[2557]: time="2026-01-28T01:21:44.493177277Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:21:44.493326 kubelet[4038]: E0128 01:21:44.493285 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:21:44.493386 kubelet[4038]: E0128 01:21:44.493340 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:21:44.493525 kubelet[4038]: E0128 01:21:44.493479 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:21:44.495032 kubelet[4038]: E0128 01:21:44.494992 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:21:49.743289 kubelet[4038]: E0128 01:21:49.743199 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:21:53.742675 kubelet[4038]: E0128 01:21:53.742622 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:21:54.741302 kubelet[4038]: E0128 01:21:54.741138 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:21:54.741999 kubelet[4038]: E0128 01:21:54.741933 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:21:55.742733 kubelet[4038]: E0128 01:21:55.742694 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:21:56.741558 kubelet[4038]: E0128 01:21:56.741489 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:00.743851 containerd[2557]: time="2026-01-28T01:22:00.743105978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:22:00.995049 containerd[2557]: time="2026-01-28T01:22:00.993615275Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:00.996408 containerd[2557]: time="2026-01-28T01:22:00.996293932Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:22:00.996408 containerd[2557]: time="2026-01-28T01:22:00.996385969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:00.996975 kubelet[4038]: E0128 01:22:00.996666 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:22:00.996975 kubelet[4038]: E0128 
01:22:00.996716 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:22:00.996975 kubelet[4038]: E0128 01:22:00.996829 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0c0e29983b554c87af1b31cc149295e5,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:01.000393 containerd[2557]: time="2026-01-28T01:22:01.000370274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:22:01.241047 containerd[2557]: time="2026-01-28T01:22:01.241003585Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:01.258895 containerd[2557]: time="2026-01-28T01:22:01.258796840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:01.258895 containerd[2557]: time="2026-01-28T01:22:01.258863994Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:22:01.259354 kubelet[4038]: E0128 01:22:01.259321 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:22:01.259430 kubelet[4038]: E0128 01:22:01.259366 4038 kuberuntime_image.go:42] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:22:01.259541 kubelet[4038]: E0128 01:22:01.259490 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:01.260802 kubelet[4038]: E0128 01:22:01.260718 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:22:05.743754 containerd[2557]: time="2026-01-28T01:22:05.743518984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:22:05.983619 containerd[2557]: time="2026-01-28T01:22:05.983579445Z" level=info 
msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:05.985960 containerd[2557]: time="2026-01-28T01:22:05.985896119Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:22:05.985960 containerd[2557]: time="2026-01-28T01:22:05.985933583Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:05.986180 kubelet[4038]: E0128 01:22:05.986144 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:22:05.986562 kubelet[4038]: E0128 01:22:05.986193 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:22:05.986562 kubelet[4038]: E0128 01:22:05.986347 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:05.987584 kubelet[4038]: E0128 01:22:05.987530 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:22:06.741826 containerd[2557]: time="2026-01-28T01:22:06.741780506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:22:06.978226 containerd[2557]: time="2026-01-28T01:22:06.978071947Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:06.980726 containerd[2557]: time="2026-01-28T01:22:06.980605353Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:22:06.980726 containerd[2557]: time="2026-01-28T01:22:06.980699253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:06.982069 kubelet[4038]: E0128 01:22:06.981023 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:22:06.982069 kubelet[4038]: E0128 01:22:06.982038 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:22:06.982331 kubelet[4038]: E0128 01:22:06.982254 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:06.985934 containerd[2557]: time="2026-01-28T01:22:06.985896687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:22:07.240275 containerd[2557]: time="2026-01-28T01:22:07.240229318Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:07.242792 containerd[2557]: time="2026-01-28T01:22:07.242748739Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:22:07.242897 containerd[2557]: time="2026-01-28T01:22:07.242843880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:07.243246 kubelet[4038]: E0128 01:22:07.243213 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:22:07.243538 kubelet[4038]: E0128 01:22:07.243261 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:22:07.243538 kubelet[4038]: E0128 01:22:07.243388 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:07.244834 kubelet[4038]: E0128 01:22:07.244793 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:22:07.742420 containerd[2557]: time="2026-01-28T01:22:07.742379403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:22:08.029203 containerd[2557]: time="2026-01-28T01:22:08.028940695Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:22:08.031324 containerd[2557]: time="2026-01-28T01:22:08.031287831Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:22:08.031385 containerd[2557]: time="2026-01-28T01:22:08.031367508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:08.031508 kubelet[4038]: E0128 01:22:08.031479 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:08.031549 kubelet[4038]: E0128 01:22:08.031521 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:08.031678 kubelet[4038]: E0128 01:22:08.031651 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dkxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:08.032945 kubelet[4038]: E0128 01:22:08.032900 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:22:08.744326 containerd[2557]: time="2026-01-28T01:22:08.744192248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:22:08.985049 containerd[2557]: time="2026-01-28T01:22:08.985005977Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:08.987296 containerd[2557]: time="2026-01-28T01:22:08.987261798Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:22:08.987382 containerd[2557]: time="2026-01-28T01:22:08.987342748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:08.987594 kubelet[4038]: E0128 01:22:08.987529 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:22:08.987594 kubelet[4038]: E0128 01:22:08.987572 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:22:08.988284 kubelet[4038]: E0128 01:22:08.988226 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wb2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:08.989624 kubelet[4038]: E0128 01:22:08.989579 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:10.743599 containerd[2557]: time="2026-01-28T01:22:10.743014409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:22:11.004450 containerd[2557]: 
time="2026-01-28T01:22:11.004317168Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:11.006653 containerd[2557]: time="2026-01-28T01:22:11.006620001Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:22:11.006720 containerd[2557]: time="2026-01-28T01:22:11.006699068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:11.006885 kubelet[4038]: E0128 01:22:11.006844 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:11.007313 kubelet[4038]: E0128 01:22:11.006901 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:11.007313 kubelet[4038]: E0128 01:22:11.007067 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rjlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:11.008686 kubelet[4038]: E0128 01:22:11.008633 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:22:14.741424 kubelet[4038]: E0128 01:22:14.741340 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:22:19.744983 kubelet[4038]: E0128 01:22:19.744733 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:22:20.742987 kubelet[4038]: E0128 01:22:20.742257 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:21.744788 kubelet[4038]: E0128 01:22:21.744687 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:22:21.748025 kubelet[4038]: 
E0128 01:22:21.747995 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:22:25.743971 kubelet[4038]: E0128 01:22:25.743765 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:22:25.746417 kubelet[4038]: E0128 01:22:25.746375 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:22:27.446000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.20:22-10.200.16.10:33652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:27.448284 kernel: kauditd_printk_skb: 239 callbacks suppressed Jan 28 01:22:27.448327 kernel: audit: type=1130 audit(1769563347.446:764): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.20:22-10.200.16.10:33652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:27.447242 systemd[1]: Started sshd@7-10.200.8.20:22-10.200.16.10:33652.service - OpenSSH per-connection server daemon (10.200.16.10:33652). 
Jan 28 01:22:27.985000 audit[6167]: USER_ACCT pid=6167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:27.991966 kernel: audit: type=1101 audit(1769563347.985:765): pid=6167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:27.992236 sshd[6167]: Accepted publickey for core from 10.200.16.10 port 33652 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:27.994085 sshd-session[6167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:27.992000 audit[6167]: CRED_ACQ pid=6167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.002459 kernel: audit: type=1103 audit(1769563347.992:766): pid=6167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.002518 kernel: audit: type=1006 audit(1769563347.992:767): pid=6167 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 28 01:22:28.008055 kernel: audit: type=1300 audit(1769563347.992:767): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd9abfcf50 a2=3 a3=0 items=0 ppid=1 pid=6167 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:27.992000 audit[6167]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd9abfcf50 a2=3 a3=0 items=0 ppid=1 pid=6167 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:27.992000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:28.010111 kernel: audit: type=1327 audit(1769563347.992:767): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:28.012841 systemd-logind[2536]: New session 11 of user core. Jan 28 01:22:28.018129 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 28 01:22:28.020000 audit[6167]: USER_START pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.027403 kernel: audit: type=1105 audit(1769563348.020:768): pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.026000 audit[6171]: CRED_ACQ pid=6171 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.035019 kernel: audit: type=1103 audit(1769563348.026:769): pid=6171 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.363046 sshd[6171]: Connection closed by 10.200.16.10 port 33652 Jan 28 01:22:28.363508 sshd-session[6167]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:28.364000 audit[6167]: USER_END pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.367757 systemd[1]: sshd@7-10.200.8.20:22-10.200.16.10:33652.service: Deactivated successfully. Jan 28 01:22:28.370426 systemd[1]: session-11.scope: Deactivated successfully. Jan 28 01:22:28.372980 kernel: audit: type=1106 audit(1769563348.364:770): pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.364000 audit[6167]: CRED_DISP pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:28.373902 systemd-logind[2536]: Session 11 logged out. Waiting for processes to exit. Jan 28 01:22:28.376002 systemd-logind[2536]: Removed session 11. Jan 28 01:22:28.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.20:22-10.200.16.10:33652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:22:28.381005 kernel: audit: type=1104 audit(1769563348.364:771): pid=6167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:30.741098 kubelet[4038]: E0128 01:22:30.741033 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:22:33.484996 systemd[1]: Started sshd@8-10.200.8.20:22-10.200.16.10:47200.service - OpenSSH per-connection server daemon (10.200.16.10:47200). Jan 28 01:22:33.486597 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:22:33.486661 kernel: audit: type=1130 audit(1769563353.484:773): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.20:22-10.200.16.10:47200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:33.484000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.20:22-10.200.16.10:47200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:33.742984 kubelet[4038]: E0128 01:22:33.742588 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:34.023000 audit[6185]: USER_ACCT pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.028601 sshd-session[6185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:34.029574 sshd[6185]: Accepted publickey for core from 10.200.16.10 port 47200 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:34.030026 kernel: audit: type=1101 audit(1769563354.023:774): pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.026000 audit[6185]: CRED_ACQ pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.037099 kernel: 
audit: type=1103 audit(1769563354.026:775): pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.037205 kernel: audit: type=1006 audit(1769563354.026:776): pid=6185 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 28 01:22:34.042130 kernel: audit: type=1300 audit(1769563354.026:776): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe6768c140 a2=3 a3=0 items=0 ppid=1 pid=6185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:34.026000 audit[6185]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe6768c140 a2=3 a3=0 items=0 ppid=1 pid=6185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:34.043881 kernel: audit: type=1327 audit(1769563354.026:776): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:34.026000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:34.044026 systemd-logind[2536]: New session 12 of user core. Jan 28 01:22:34.051698 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 28 01:22:34.053000 audit[6185]: USER_START pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.068979 kernel: audit: type=1105 audit(1769563354.053:777): pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.053000 audit[6189]: CRED_ACQ pid=6189 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.075978 kernel: audit: type=1103 audit(1769563354.053:778): pid=6189 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.392048 sshd[6189]: Connection closed by 10.200.16.10 port 47200 Jan 28 01:22:34.393181 sshd-session[6185]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:34.394000 audit[6185]: USER_END pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.400489 systemd-logind[2536]: Session 12 logged out. 
Waiting for processes to exit. Jan 28 01:22:34.401701 systemd[1]: sshd@8-10.200.8.20:22-10.200.16.10:47200.service: Deactivated successfully. Jan 28 01:22:34.402197 kernel: audit: type=1106 audit(1769563354.394:779): pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.405273 systemd[1]: session-12.scope: Deactivated successfully. Jan 28 01:22:34.394000 audit[6185]: CRED_DISP pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.410021 kernel: audit: type=1104 audit(1769563354.394:780): pid=6185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:34.411628 systemd-logind[2536]: Removed session 12. Jan 28 01:22:34.401000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.20:22-10.200.16.10:47200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:34.741971 kubelet[4038]: E0128 01:22:34.741757 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:22:35.747186 kubelet[4038]: E0128 01:22:35.745844 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:22:37.744773 kubelet[4038]: E0128 01:22:37.744452 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:22:38.741891 kubelet[4038]: E0128 01:22:38.741352 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:22:39.512288 systemd[1]: Started sshd@9-10.200.8.20:22-10.200.16.10:47214.service - OpenSSH per-connection server daemon (10.200.16.10:47214). Jan 28 01:22:39.519884 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:22:39.519917 kernel: audit: type=1130 audit(1769563359.511:782): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.20:22-10.200.16.10:47214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:39.511000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.20:22-10.200.16.10:47214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:22:40.062000 audit[6202]: USER_ACCT pid=6202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.066726 sshd-session[6202]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:40.067977 kernel: audit: type=1101 audit(1769563360.062:783): pid=6202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.068010 sshd[6202]: Accepted publickey for core from 10.200.16.10 port 47214 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:40.073266 kernel: audit: type=1103 audit(1769563360.064:784): pid=6202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.064000 audit[6202]: CRED_ACQ pid=6202 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.075986 kernel: audit: type=1006 audit(1769563360.065:785): pid=6202 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 28 01:22:40.065000 audit[6202]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd63288d00 a2=3 a3=0 items=0 ppid=1 pid=6202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:40.082851 kernel: audit: type=1300 audit(1769563360.065:785): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd63288d00 a2=3 a3=0 items=0 ppid=1 pid=6202 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:40.082920 kernel: audit: type=1327 audit(1769563360.065:785): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:40.065000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:40.086678 systemd-logind[2536]: New session 13 of user core. Jan 28 01:22:40.093292 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 28 01:22:40.096000 audit[6202]: USER_START pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.100000 audit[6208]: CRED_ACQ pid=6208 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.107496 kernel: audit: type=1105 audit(1769563360.096:786): pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.107542 kernel: audit: type=1103 audit(1769563360.100:787): pid=6208 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.420047 sshd[6208]: Connection closed by 10.200.16.10 port 47214 Jan 28 01:22:40.420179 sshd-session[6202]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:40.420000 audit[6202]: USER_END pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.423821 systemd[1]: sshd@9-10.200.8.20:22-10.200.16.10:47214.service: Deactivated successfully. Jan 28 01:22:40.426051 systemd[1]: session-13.scope: Deactivated successfully. Jan 28 01:22:40.429123 systemd-logind[2536]: Session 13 logged out. Waiting for processes to exit. Jan 28 01:22:40.430015 kernel: audit: type=1106 audit(1769563360.420:788): pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.421000 audit[6202]: CRED_DISP pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.430515 systemd-logind[2536]: Removed session 13. Jan 28 01:22:40.423000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.20:22-10.200.16.10:47214 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:22:40.435005 kernel: audit: type=1104 audit(1769563360.421:789): pid=6202 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:40.545197 systemd[1]: Started sshd@10-10.200.8.20:22-10.200.16.10:45762.service - OpenSSH per-connection server daemon (10.200.16.10:45762). Jan 28 01:22:40.544000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.200.8.20:22-10.200.16.10:45762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:41.094000 audit[6221]: USER_ACCT pid=6221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.095590 sshd[6221]: Accepted publickey for core from 10.200.16.10 port 45762 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:41.096000 audit[6221]: CRED_ACQ pid=6221 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.096000 audit[6221]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea362c430 a2=3 a3=0 items=0 ppid=1 pid=6221 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:41.096000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:41.098814 sshd-session[6221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:41.106565 systemd-logind[2536]: New session 14 of user core. Jan 28 01:22:41.110273 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 28 01:22:41.112000 audit[6221]: USER_START pid=6221 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.114000 audit[6225]: CRED_ACQ pid=6225 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.466220 sshd[6225]: Connection closed by 10.200.16.10 port 45762 Jan 28 01:22:41.467123 sshd-session[6221]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:41.467000 audit[6221]: USER_END pid=6221 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.467000 audit[6221]: CRED_DISP pid=6221 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:41.470297 systemd[1]: sshd@10-10.200.8.20:22-10.200.16.10:45762.service: Deactivated successfully. Jan 28 01:22:41.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.200.8.20:22-10.200.16.10:45762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:41.472242 systemd[1]: session-14.scope: Deactivated successfully. Jan 28 01:22:41.474033 systemd-logind[2536]: Session 14 logged out. Waiting for processes to exit. Jan 28 01:22:41.475009 systemd-logind[2536]: Removed session 14. Jan 28 01:22:41.576678 systemd[1]: Started sshd@11-10.200.8.20:22-10.200.16.10:45774.service - OpenSSH per-connection server daemon (10.200.16.10:45774). Jan 28 01:22:41.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.200.8.20:22-10.200.16.10:45774 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:22:42.109000 audit[6235]: USER_ACCT pid=6235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.110512 sshd[6235]: Accepted publickey for core from 10.200.16.10 port 45774 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:42.110000 audit[6235]: CRED_ACQ pid=6235 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.110000 audit[6235]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc26052080 a2=3 a3=0 items=0 ppid=1 pid=6235 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:42.110000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:42.111932 sshd-session[6235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:42.116169 systemd-logind[2536]: New session 15 of user core. Jan 28 01:22:42.118144 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 28 01:22:42.119000 audit[6235]: USER_START pid=6235 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.121000 audit[6239]: CRED_ACQ pid=6239 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.494406 sshd[6239]: Connection closed by 10.200.16.10 port 45774 Jan 28 01:22:42.495112 sshd-session[6235]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:42.497000 audit[6235]: USER_END pid=6235 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.497000 audit[6235]: CRED_DISP pid=6235 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:42.500000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.200.8.20:22-10.200.16.10:45774 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:42.500874 systemd-logind[2536]: Session 15 logged out. Waiting for processes to exit. Jan 28 01:22:42.501521 systemd[1]: sshd@11-10.200.8.20:22-10.200.16.10:45774.service: Deactivated successfully. Jan 28 01:22:42.505628 systemd[1]: session-15.scope: Deactivated successfully. 
Jan 28 01:22:42.508744 systemd-logind[2536]: Removed session 15. Jan 28 01:22:42.741655 kubelet[4038]: E0128 01:22:42.741572 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:22:45.744298 kubelet[4038]: E0128 01:22:45.744198 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:47.619923 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 28 01:22:47.620053 kernel: audit: type=1130 audit(1769563367.613:809): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.200.8.20:22-10.200.16.10:45788 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:47.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.200.8.20:22-10.200.16.10:45788 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:47.614054 systemd[1]: Started sshd@12-10.200.8.20:22-10.200.16.10:45788.service - OpenSSH per-connection server daemon (10.200.16.10:45788). 
Jan 28 01:22:48.154000 audit[6264]: USER_ACCT pid=6264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.159975 kernel: audit: type=1101 audit(1769563368.154:810): pid=6264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.159033 sshd-session[6264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:48.160311 sshd[6264]: Accepted publickey for core from 10.200.16.10 port 45788 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:48.157000 audit[6264]: CRED_ACQ pid=6264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.165086 kernel: audit: type=1103 audit(1769563368.157:811): pid=6264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.172721 kernel: audit: type=1006 audit(1769563368.157:812): pid=6264 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 28 01:22:48.172774 kernel: audit: type=1300 audit(1769563368.157:812): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffa9df29e0 a2=3 a3=0 items=0 ppid=1 pid=6264 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:48.157000 audit[6264]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffa9df29e0 a2=3 a3=0 items=0 ppid=1 pid=6264 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:48.168845 systemd-logind[2536]: New session 16 of user core. Jan 28 01:22:48.157000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:48.175969 kernel: audit: type=1327 audit(1769563368.157:812): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:48.179109 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 28 01:22:48.180000 audit[6264]: USER_START pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.185000 audit[6268]: CRED_ACQ pid=6268 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.192974 kernel: audit: type=1105 audit(1769563368.180:813): pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.193035 kernel: audit: type=1103 audit(1769563368.185:814): pid=6268 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.507149 sshd[6268]: Connection closed by 10.200.16.10 port 45788 Jan 28 01:22:48.508771 sshd-session[6264]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:48.508000 audit[6264]: USER_END pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.515494 systemd[1]: sshd@12-10.200.8.20:22-10.200.16.10:45788.service: Deactivated successfully. Jan 28 01:22:48.519995 kernel: audit: type=1106 audit(1769563368.508:815): pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.520063 kernel: audit: type=1104 audit(1769563368.509:816): pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.509000 audit[6264]: CRED_DISP pid=6264 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:48.517642 systemd[1]: session-16.scope: Deactivated successfully. Jan 28 01:22:48.521714 systemd-logind[2536]: Session 16 logged out. Waiting for processes to exit. Jan 28 01:22:48.514000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.200.8.20:22-10.200.16.10:45788 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:48.523159 systemd-logind[2536]: Removed session 16. 
Jan 28 01:22:48.741382 kubelet[4038]: E0128 01:22:48.740998 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:22:48.741809 containerd[2557]: time="2026-01-28T01:22:48.741145119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 01:22:49.017978 containerd[2557]: time="2026-01-28T01:22:49.016711762Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:49.019322 containerd[2557]: time="2026-01-28T01:22:49.019193567Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 01:22:49.019322 containerd[2557]: time="2026-01-28T01:22:49.019305310Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:49.019462 kubelet[4038]: E0128 01:22:49.019424 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:22:49.019505 kubelet[4038]: E0128 01:22:49.019471 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 01:22:49.019813 kubelet[4038]: E0128 01:22:49.019757 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:49.021964 containerd[2557]: time="2026-01-28T01:22:49.021930896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 01:22:49.257871 containerd[2557]: time="2026-01-28T01:22:49.257822902Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:49.262362 containerd[2557]: time="2026-01-28T01:22:49.262334060Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 01:22:49.262439 containerd[2557]: time="2026-01-28T01:22:49.262401939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:49.262581 kubelet[4038]: E0128 01:22:49.262545 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:22:49.262631 kubelet[4038]: E0128 01:22:49.262593 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 01:22:49.262885 kubelet[4038]: E0128 01:22:49.262724 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-lcd4c_calico-system(74fe3431-17ca-4902-9eb5-64c3701d8bd6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:49.263902 kubelet[4038]: E0128 01:22:49.263871 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:22:49.745115 containerd[2557]: time="2026-01-28T01:22:49.745041584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 01:22:49.982363 containerd[2557]: time="2026-01-28T01:22:49.982309973Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
01:22:49.985072 containerd[2557]: time="2026-01-28T01:22:49.985042196Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 01:22:49.985157 containerd[2557]: time="2026-01-28T01:22:49.985055916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:49.985238 kubelet[4038]: E0128 01:22:49.985205 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:22:49.986272 kubelet[4038]: E0128 01:22:49.985247 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 01:22:49.986272 kubelet[4038]: E0128 01:22:49.985405 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-kz629_calico-system(d68b64e3-e019-4732-8971-c8457279d8f6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:49.986871 kubelet[4038]: E0128 01:22:49.986843 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:22:53.620366 systemd[1]: Started sshd@13-10.200.8.20:22-10.200.16.10:51370.service - OpenSSH per-connection server daemon (10.200.16.10:51370). Jan 28 01:22:53.619000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.20:22-10.200.16.10:51370 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:53.623528 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:22:53.623617 kernel: audit: type=1130 audit(1769563373.619:818): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.20:22-10.200.16.10:51370 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:22:53.743624 containerd[2557]: time="2026-01-28T01:22:53.743087781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 01:22:53.991920 containerd[2557]: time="2026-01-28T01:22:53.991883467Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:53.995041 containerd[2557]: time="2026-01-28T01:22:53.995008193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 01:22:53.995143 containerd[2557]: time="2026-01-28T01:22:53.995088370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:53.996125 kubelet[4038]: E0128 01:22:53.996027 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:22:53.996125 kubelet[4038]: E0128 01:22:53.996084 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 01:22:53.996794 kubelet[4038]: E0128 01:22:53.996649 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0c0e29983b554c87af1b31cc149295e5,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:53.998708 containerd[2557]: time="2026-01-28T01:22:53.998681802Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 01:22:54.175000 audit[6307]: USER_ACCT pid=6307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.180432 sshd-session[6307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:22:54.178000 audit[6307]: CRED_ACQ pid=6307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.181648 sshd[6307]: Accepted publickey for core from 10.200.16.10 port 51370 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:22:54.185114 kernel: audit: type=1101 audit(1769563374.175:819): pid=6307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.185195 kernel: audit: type=1103 audit(1769563374.178:820): pid=6307 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.188332 kernel: audit: type=1006 audit(1769563374.178:821): pid=6307 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 28 01:22:54.193038 kernel: audit: type=1300 audit(1769563374.178:821): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe70ffdeb0 a2=3 a3=0 items=0 ppid=1 pid=6307 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:54.178000 audit[6307]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe70ffdeb0 a2=3 a3=0 items=0 ppid=1 pid=6307 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:22:54.178000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:54.195971 kernel: audit: type=1327 audit(1769563374.178:821): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:22:54.197004 systemd-logind[2536]: New session 17 of user core. Jan 28 01:22:54.204216 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 28 01:22:54.205000 audit[6307]: USER_START pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.210000 audit[6311]: CRED_ACQ pid=6311 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.214750 kernel: audit: type=1105 audit(1769563374.205:822): pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.214842 kernel: audit: type=1103 audit(1769563374.210:823): pid=6311 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.248962 containerd[2557]: time="2026-01-28T01:22:54.248589230Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:54.250956 containerd[2557]: time="2026-01-28T01:22:54.250921696Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 01:22:54.251014 containerd[2557]: time="2026-01-28T01:22:54.250997893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:54.251186 kubelet[4038]: E0128 01:22:54.251141 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:22:54.251234 kubelet[4038]: E0128 01:22:54.251188 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 01:22:54.251346 kubelet[4038]: E0128 01:22:54.251309 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sdzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6f6448978d-8pkwl_calico-system(71a93f75-99db-41f8-a193-bdcc3af98dc1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:54.252729 kubelet[4038]: E0128 01:22:54.252690 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:22:54.523506 sshd[6311]: Connection closed by 10.200.16.10 port 51370 Jan 28 01:22:54.524471 sshd-session[6307]: pam_unix(sshd:session): session closed for user core Jan 28 01:22:54.525000 audit[6307]: USER_END pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.530315 systemd[1]: sshd@13-10.200.8.20:22-10.200.16.10:51370.service: Deactivated 
successfully. Jan 28 01:22:54.533258 systemd[1]: session-17.scope: Deactivated successfully. Jan 28 01:22:54.525000 audit[6307]: CRED_DISP pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.534452 systemd-logind[2536]: Session 17 logged out. Waiting for processes to exit. Jan 28 01:22:54.537333 systemd-logind[2536]: Removed session 17. Jan 28 01:22:54.539169 kernel: audit: type=1106 audit(1769563374.525:824): pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.539218 kernel: audit: type=1104 audit(1769563374.525:825): pid=6307 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:22:54.529000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.20:22-10.200.16.10:51370 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:54.741554 containerd[2557]: time="2026-01-28T01:22:54.741450207Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:22:54.981410 containerd[2557]: time="2026-01-28T01:22:54.981375149Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:54.983822 containerd[2557]: time="2026-01-28T01:22:54.983788535Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:22:54.983877 containerd[2557]: time="2026-01-28T01:22:54.983863293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:22:54.984053 kubelet[4038]: E0128 01:22:54.984022 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:54.984109 kubelet[4038]: E0128 01:22:54.984065 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:22:54.984224 kubelet[4038]: E0128 01:22:54.984195 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dkxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-85rm2_calico-apiserver(489aa6ff-974c-4c0f-ad71-b359b70146bf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:54.986341 kubelet[4038]: E0128 01:22:54.986186 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:22:57.744042 containerd[2557]: time="2026-01-28T01:22:57.743783786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 01:22:57.976730 containerd[2557]: time="2026-01-28T01:22:57.976691610Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:22:57.979222 containerd[2557]: time="2026-01-28T01:22:57.979174633Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 01:22:57.979222 containerd[2557]: time="2026-01-28T01:22:57.979201727Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 
28 01:22:57.979402 kubelet[4038]: E0128 01:22:57.979334 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:22:57.979402 kubelet[4038]: E0128 01:22:57.979372 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 01:22:57.979737 kubelet[4038]: E0128 01:22:57.979505 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wb2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5496f89df7-4vb68_calico-system(8b2b158a-081b-4454-a96f-65445d9cadc6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 01:22:57.980968 kubelet[4038]: E0128 01:22:57.980922 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:22:59.648000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.20:22-10.200.16.10:56312 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:22:59.649316 systemd[1]: Started sshd@14-10.200.8.20:22-10.200.16.10:56312.service - OpenSSH per-connection server daemon (10.200.16.10:56312). Jan 28 01:22:59.651229 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:22:59.651495 kernel: audit: type=1130 audit(1769563379.648:827): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.20:22-10.200.16.10:56312 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:00.189000 audit[6344]: USER_ACCT pid=6344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.195974 kernel: audit: type=1101 audit(1769563380.189:828): pid=6344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.196519 sshd[6344]: Accepted publickey for core from 10.200.16.10 port 56312 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:00.199700 sshd-session[6344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:00.195000 audit[6344]: CRED_ACQ pid=6344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.205969 kernel: audit: type=1103 audit(1769563380.195:829): pid=6344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.212976 kernel: audit: type=1006 audit(1769563380.195:830): pid=6344 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 28 01:23:00.195000 audit[6344]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffde5269de0 a2=3 a3=0 items=0 ppid=1 pid=6344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:00.216467 
systemd-logind[2536]: New session 18 of user core. Jan 28 01:23:00.218965 kernel: audit: type=1300 audit(1769563380.195:830): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffde5269de0 a2=3 a3=0 items=0 ppid=1 pid=6344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:00.195000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:00.221391 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 28 01:23:00.223110 kernel: audit: type=1327 audit(1769563380.195:830): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:00.223000 audit[6344]: USER_START pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.229991 kernel: audit: type=1105 audit(1769563380.223:831): pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.227000 audit[6348]: CRED_ACQ pid=6348 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.238011 kernel: audit: type=1103 audit(1769563380.227:832): pid=6348 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.541090 sshd[6348]: Connection closed by 10.200.16.10 port 56312 Jan 28 01:23:00.541895 sshd-session[6344]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:00.542000 audit[6344]: USER_END pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.542000 audit[6344]: CRED_DISP pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.550477 kernel: audit: type=1106 audit(1769563380.542:833): pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.550553 kernel: audit: type=1104 audit(1769563380.542:834): pid=6344 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:00.550833 systemd[1]: sshd@14-10.200.8.20:22-10.200.16.10:56312.service: Deactivated successfully. Jan 28 01:23:00.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.20:22-10.200.16.10:56312 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:00.553716 systemd[1]: session-18.scope: Deactivated successfully. Jan 28 01:23:00.555662 systemd-logind[2536]: Session 18 logged out. Waiting for processes to exit. Jan 28 01:23:00.557044 systemd-logind[2536]: Removed session 18. Jan 28 01:23:00.660000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.200.8.20:22-10.200.16.10:56314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:00.661749 systemd[1]: Started sshd@15-10.200.8.20:22-10.200.16.10:56314.service - OpenSSH per-connection server daemon (10.200.16.10:56314). Jan 28 01:23:00.741123 containerd[2557]: time="2026-01-28T01:23:00.741089817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 01:23:00.976549 containerd[2557]: time="2026-01-28T01:23:00.976404070Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 01:23:00.978921 containerd[2557]: time="2026-01-28T01:23:00.978856785Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 01:23:00.979015 containerd[2557]: time="2026-01-28T01:23:00.978964282Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 01:23:00.979199 kubelet[4038]: E0128 01:23:00.979152 4038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:23:00.979455 kubelet[4038]: E0128 01:23:00.979214 4038 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 01:23:00.979455 kubelet[4038]: E0128 01:23:00.979361 4038 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rjlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-db6789d8-bg2b5_calico-apiserver(4b861133-0274-4274-bab9-748410e42edc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 01:23:00.980873 kubelet[4038]: E0128 01:23:00.980823 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:23:01.195000 audit[6360]: USER_ACCT pid=6360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.196684 sshd[6360]: Accepted publickey for core from 10.200.16.10 port 56314 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:01.196000 audit[6360]: CRED_ACQ pid=6360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.196000 audit[6360]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe476aea70 a2=3 a3=0 items=0 
ppid=1 pid=6360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:01.196000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:01.198668 sshd-session[6360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:01.205823 systemd-logind[2536]: New session 19 of user core. Jan 28 01:23:01.213150 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 28 01:23:01.214000 audit[6360]: USER_START pid=6360 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.215000 audit[6364]: CRED_ACQ pid=6364 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.667113 sshd[6364]: Connection closed by 10.200.16.10 port 56314 Jan 28 01:23:01.668141 sshd-session[6360]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:01.669000 audit[6360]: USER_END pid=6360 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.669000 audit[6360]: CRED_DISP pid=6360 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:01.673664 systemd[1]: sshd@15-10.200.8.20:22-10.200.16.10:56314.service: Deactivated successfully. Jan 28 01:23:01.673000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.200.8.20:22-10.200.16.10:56314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:01.676489 systemd[1]: session-19.scope: Deactivated successfully. Jan 28 01:23:01.677520 systemd-logind[2536]: Session 19 logged out. Waiting for processes to exit. Jan 28 01:23:01.679750 systemd-logind[2536]: Removed session 19. Jan 28 01:23:01.780000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.200.8.20:22-10.200.16.10:56318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:01.780847 systemd[1]: Started sshd@16-10.200.8.20:22-10.200.16.10:56318.service - OpenSSH per-connection server daemon (10.200.16.10:56318). 
Jan 28 01:23:02.337000 audit[6374]: USER_ACCT pid=6374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:02.338460 sshd[6374]: Accepted publickey for core from 10.200.16.10 port 56318 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:02.338000 audit[6374]: CRED_ACQ pid=6374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:02.338000 audit[6374]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff204bedd0 a2=3 a3=0 items=0 ppid=1 pid=6374 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:02.338000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:02.339944 sshd-session[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:02.344201 systemd-logind[2536]: New session 20 of user core. Jan 28 01:23:02.347095 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 28 01:23:02.348000 audit[6374]: USER_START pid=6374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:02.350000 audit[6378]: CRED_ACQ pid=6378 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:02.743231 kubelet[4038]: E0128 01:23:02.742881 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:23:03.201000 audit[6389]: NETFILTER_CFG table=filter:145 family=2 entries=26 op=nft_register_rule pid=6389 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:03.201000 audit[6389]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffca087d140 a2=0 a3=7ffca087d12c items=0 ppid=4145 pid=6389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:03.201000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:03.206000 audit[6389]: NETFILTER_CFG table=nat:146 family=2 entries=20 op=nft_register_rule pid=6389 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:03.206000 audit[6389]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffca087d140 a2=0 a3=0 items=0 ppid=4145 pid=6389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:03.206000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:03.224000 audit[6391]: NETFILTER_CFG table=filter:147 family=2 entries=38 op=nft_register_rule pid=6391 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:03.224000 audit[6391]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffdd535fa20 a2=0 a3=7ffdd535fa0c items=0 ppid=4145 pid=6391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:03.224000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:03.231000 audit[6391]: NETFILTER_CFG table=nat:148 family=2 entries=20 op=nft_register_rule pid=6391 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:03.231000 audit[6391]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffdd535fa20 a2=0 a3=0 items=0 ppid=4145 pid=6391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:03.231000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:03.322454 sshd[6378]: Connection closed by 10.200.16.10 port 56318 Jan 28 01:23:03.322872 sshd-session[6374]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:03.324000 audit[6374]: USER_END pid=6374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:03.324000 audit[6374]: CRED_DISP pid=6374 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:03.327915 systemd[1]: sshd@16-10.200.8.20:22-10.200.16.10:56318.service: Deactivated successfully. Jan 28 01:23:03.327000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.200.8.20:22-10.200.16.10:56318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:23:03.329911 systemd[1]: session-20.scope: Deactivated successfully. Jan 28 01:23:03.330886 systemd-logind[2536]: Session 20 logged out. Waiting for processes to exit. Jan 28 01:23:03.333709 systemd-logind[2536]: Removed session 20. Jan 28 01:23:03.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.200.8.20:22-10.200.16.10:56330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:03.436750 systemd[1]: Started sshd@17-10.200.8.20:22-10.200.16.10:56330.service - OpenSSH per-connection server daemon (10.200.16.10:56330). Jan 28 01:23:03.989000 audit[6396]: USER_ACCT pid=6396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:03.991888 sshd[6396]: Accepted publickey for core from 10.200.16.10 port 56330 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:03.990000 audit[6396]: CRED_ACQ pid=6396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:03.990000 audit[6396]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff0ed299b0 a2=3 a3=0 items=0 ppid=1 pid=6396 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:03.990000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:03.993401 sshd-session[6396]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:03.999162 systemd-logind[2536]: New session 21 of user core. Jan 28 01:23:04.006298 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 28 01:23:04.008000 audit[6396]: USER_START pid=6396 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:04.012000 audit[6400]: CRED_ACQ pid=6400 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:04.436442 sshd[6400]: Connection closed by 10.200.16.10 port 56330 Jan 28 01:23:04.437126 sshd-session[6396]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:04.437000 audit[6396]: USER_END pid=6396 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:04.437000 audit[6396]: CRED_DISP pid=6396 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:04.442992 systemd-logind[2536]: Session 21 logged out. Waiting for processes to exit. Jan 28 01:23:04.444300 systemd[1]: sshd@17-10.200.8.20:22-10.200.16.10:56330.service: Deactivated successfully. Jan 28 01:23:04.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.200.8.20:22-10.200.16.10:56330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:04.446918 systemd[1]: session-21.scope: Deactivated successfully. Jan 28 01:23:04.449288 systemd-logind[2536]: Removed session 21. Jan 28 01:23:04.555805 systemd[1]: Started sshd@18-10.200.8.20:22-10.200.16.10:56340.service - OpenSSH per-connection server daemon (10.200.16.10:56340). Jan 28 01:23:04.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.200.8.20:22-10.200.16.10:56340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:23:04.741484 kubelet[4038]: E0128 01:23:04.741382 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:23:05.112767 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 28 01:23:05.112888 kernel: audit: type=1101 audit(1769563385.103:868): pid=6411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.103000 audit[6411]: USER_ACCT pid=6411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.113310 sshd[6411]: Accepted publickey for core from 10.200.16.10 port 56340 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:05.108405 sshd-session[6411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:05.105000 audit[6411]: CRED_ACQ pid=6411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.124471 systemd-logind[2536]: New session 22 of user core. Jan 28 01:23:05.125971 kernel: audit: type=1103 audit(1769563385.105:869): pid=6411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.128579 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 28 01:23:05.131973 kernel: audit: type=1006 audit(1769563385.105:870): pid=6411 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 28 01:23:05.105000 audit[6411]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdda33db20 a2=3 a3=0 items=0 ppid=1 pid=6411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:05.141973 kernel: audit: type=1300 audit(1769563385.105:870): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdda33db20 a2=3 a3=0 items=0 ppid=1 pid=6411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:05.105000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:05.145968 kernel: audit: type=1327 audit(1769563385.105:870): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:05.132000 audit[6411]: USER_START pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.161989 kernel: audit: type=1105 audit(1769563385.132:871): pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.162058 kernel: audit: type=1103 audit(1769563385.133:872): pid=6415 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.133000 audit[6415]: CRED_ACQ pid=6415 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.453278 sshd[6415]: Connection closed by 10.200.16.10 port 56340 Jan 28 01:23:05.454928 sshd-session[6411]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:05.455000 audit[6411]: USER_END pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.461575 systemd[1]: sshd@18-10.200.8.20:22-10.200.16.10:56340.service: Deactivated successfully. 
Jan 28 01:23:05.466646 kernel: audit: type=1106 audit(1769563385.455:873): pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.466718 kernel: audit: type=1104 audit(1769563385.455:874): pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.455000 audit[6411]: CRED_DISP pid=6411 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:05.465851 systemd[1]: session-22.scope: Deactivated successfully. Jan 28 01:23:05.458000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.200.8.20:22-10.200.16.10:56340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:05.469839 systemd-logind[2536]: Session 22 logged out. Waiting for processes to exit. Jan 28 01:23:05.471396 systemd-logind[2536]: Removed session 22. Jan 28 01:23:05.471988 kernel: audit: type=1131 audit(1769563385.458:875): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.200.8.20:22-10.200.16.10:56340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:06.744064 kubelet[4038]: E0128 01:23:06.743495 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:23:07.957000 audit[6427]: NETFILTER_CFG table=filter:149 family=2 entries=26 op=nft_register_rule pid=6427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:07.957000 audit[6427]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffdc48aa130 a2=0 a3=7ffdc48aa11c items=0 ppid=4145 pid=6427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:07.957000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:07.963000 audit[6427]: NETFILTER_CFG table=nat:150 family=2 entries=104 op=nft_register_chain pid=6427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 01:23:07.963000 audit[6427]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffdc48aa130 a2=0 a3=7ffdc48aa11c items=0 ppid=4145 pid=6427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:07.963000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 01:23:09.746614 kubelet[4038]: E0128 01:23:09.746550 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:23:09.748169 kubelet[4038]: E0128 01:23:09.747311 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:23:10.567103 systemd[1]: Started sshd@19-10.200.8.20:22-10.200.16.10:39552.service - OpenSSH per-connection server daemon (10.200.16.10:39552). Jan 28 01:23:10.572832 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 28 01:23:10.572917 kernel: audit: type=1130 audit(1769563390.566:878): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.200.8.20:22-10.200.16.10:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:10.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.200.8.20:22-10.200.16.10:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:23:11.109299 sshd[6429]: Accepted publickey for core from 10.200.16.10 port 39552 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:11.115483 kernel: audit: type=1101 audit(1769563391.108:879): pid=6429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.108000 audit[6429]: USER_ACCT pid=6429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.116727 sshd-session[6429]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:11.114000 audit[6429]: CRED_ACQ pid=6429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.124723 systemd-logind[2536]: New session 23 of user core. Jan 28 01:23:11.128465 kernel: audit: type=1103 audit(1769563391.114:880): pid=6429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.128533 kernel: audit: type=1006 audit(1769563391.114:881): pid=6429 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 28 01:23:11.114000 audit[6429]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff53a56cf0 a2=3 a3=0 items=0 ppid=1 pid=6429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:11.134836 kernel: audit: type=1300 audit(1769563391.114:881): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff53a56cf0 a2=3 a3=0 items=0 ppid=1 pid=6429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:11.135222 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 28 01:23:11.114000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:11.140991 kernel: audit: type=1327 audit(1769563391.114:881): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:11.142000 audit[6429]: USER_START pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.153982 kernel: audit: type=1105 audit(1769563391.142:882): pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.143000 audit[6433]: CRED_ACQ pid=6433 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.158986 kernel: audit: type=1103 audit(1769563391.143:883): pid=6433 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.492835 sshd[6433]: Connection closed by 10.200.16.10 port 39552 Jan 28 01:23:11.493327 sshd-session[6429]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:11.493000 audit[6429]: USER_END pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.503087 kernel: audit: type=1106 audit(1769563391.493:884): pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.503636 systemd[1]: sshd@19-10.200.8.20:22-10.200.16.10:39552.service: Deactivated successfully. Jan 28 01:23:11.506373 systemd[1]: session-23.scope: Deactivated successfully. Jan 28 01:23:11.507363 systemd-logind[2536]: Session 23 logged out. Waiting for processes to exit. Jan 28 01:23:11.493000 audit[6429]: CRED_DISP pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:11.512469 systemd-logind[2536]: Removed session 23. Jan 28 01:23:11.502000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.200.8.20:22-10.200.16.10:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:23:11.514965 kernel: audit: type=1104 audit(1769563391.493:885): pid=6429 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:14.742059 kubelet[4038]: E0128 01:23:14.742022 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:23:16.616000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.20:22-10.200.16.10:39568 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:16.616879 systemd[1]: Started sshd@20-10.200.8.20:22-10.200.16.10:39568.service - OpenSSH per-connection server daemon (10.200.16.10:39568). Jan 28 01:23:16.618636 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:23:16.618691 kernel: audit: type=1130 audit(1769563396.616:887): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.20:22-10.200.16.10:39568 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:17.159000 audit[6445]: USER_ACCT pid=6445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.162469 sshd-session[6445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:17.163625 sshd[6445]: Accepted publickey for core from 10.200.16.10 port 39568 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:17.175200 kernel: audit: type=1101 audit(1769563397.159:888): pid=6445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.159000 audit[6445]: CRED_ACQ pid=6445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.182449 kernel: audit: type=1103 audit(1769563397.159:889): pid=6445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.182527 kernel: audit: type=1006 audit(1769563397.159:890): pid=6445 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 28 01:23:17.180460 systemd-logind[2536]: New session 24 of user core. 
Jan 28 01:23:17.186936 kernel: audit: type=1300 audit(1769563397.159:890): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffceb4ba470 a2=3 a3=0 items=0 ppid=1 pid=6445 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:17.159000 audit[6445]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffceb4ba470 a2=3 a3=0 items=0 ppid=1 pid=6445 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:17.187270 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 28 01:23:17.159000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:17.190964 kernel: audit: type=1327 audit(1769563397.159:890): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:17.192000 audit[6445]: USER_START pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.198978 kernel: audit: type=1105 audit(1769563397.192:891): pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.198000 audit[6451]: CRED_ACQ pid=6451 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.207967 kernel: audit: type=1103 audit(1769563397.198:892): pid=6451 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.528673 sshd[6451]: Connection closed by 10.200.16.10 port 39568 Jan 28 01:23:17.530035 sshd-session[6445]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:17.531000 audit[6445]: USER_END pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.537967 kernel: audit: type=1106 audit(1769563397.531:893): pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.537601 systemd[1]: sshd@20-10.200.8.20:22-10.200.16.10:39568.service: Deactivated successfully. Jan 28 01:23:17.541735 systemd[1]: session-24.scope: Deactivated successfully. 
Jan 28 01:23:17.531000 audit[6445]: CRED_DISP pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.546089 systemd-logind[2536]: Session 24 logged out. Waiting for processes to exit. Jan 28 01:23:17.547002 kernel: audit: type=1104 audit(1769563397.531:894): pid=6445 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:17.547379 systemd-logind[2536]: Removed session 24. Jan 28 01:23:17.536000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.20:22-10.200.16.10:39568 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:17.742755 kubelet[4038]: E0128 01:23:17.742717 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:23:17.744532 kubelet[4038]: E0128 01:23:17.744377 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:23:17.746711 kubelet[4038]: E0128 01:23:17.746640 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:23:21.743341 kubelet[4038]: E0128 01:23:21.743260 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6" Jan 28 01:23:22.649722 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:23:22.649938 kernel: audit: type=1130 audit(1769563402.645:896): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.20:22-10.200.16.10:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:22.645000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.20:22-10.200.16.10:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:22.646224 systemd[1]: Started sshd@21-10.200.8.20:22-10.200.16.10:53486.service - OpenSSH per-connection server daemon (10.200.16.10:53486). Jan 28 01:23:22.744516 kubelet[4038]: E0128 01:23:22.744477 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1" Jan 28 01:23:23.185000 audit[6487]: USER_ACCT pid=6487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.188476 sshd[6487]: Accepted publickey for core from 10.200.16.10 port 53486 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:23.190182 sshd-session[6487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:23.185000 audit[6487]: CRED_ACQ pid=6487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.196620 kernel: audit: type=1101 audit(1769563403.185:897): pid=6487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.196686 kernel: audit: type=1103 audit(1769563403.185:898): pid=6487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.201597 kernel: audit: type=1006 audit(1769563403.185:899): pid=6487 uid=0 
subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 28 01:23:23.200665 systemd-logind[2536]: New session 25 of user core. Jan 28 01:23:23.185000 audit[6487]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc4bbfcb70 a2=3 a3=0 items=0 ppid=1 pid=6487 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:23.206594 kernel: audit: type=1300 audit(1769563403.185:899): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc4bbfcb70 a2=3 a3=0 items=0 ppid=1 pid=6487 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:23.185000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:23.209425 kernel: audit: type=1327 audit(1769563403.185:899): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:23.210176 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 28 01:23:23.212000 audit[6487]: USER_START pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.218000 audit[6491]: CRED_ACQ pid=6491 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.221180 kernel: audit: type=1105 audit(1769563403.212:900): pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.221222 kernel: audit: type=1103 audit(1769563403.218:901): pid=6491 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.537809 sshd[6491]: Connection closed by 10.200.16.10 port 53486 Jan 28 01:23:23.540217 sshd-session[6487]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:23.540000 audit[6487]: USER_END pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.545920 systemd[1]: sshd@21-10.200.8.20:22-10.200.16.10:53486.service: Deactivated successfully. 
Jan 28 01:23:23.551619 kernel: audit: type=1106 audit(1769563403.540:902): pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.551684 kernel: audit: type=1104 audit(1769563403.540:903): pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.540000 audit[6487]: CRED_DISP pid=6487 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:23.550387 systemd[1]: session-25.scope: Deactivated successfully. Jan 28 01:23:23.551937 systemd-logind[2536]: Session 25 logged out. Waiting for processes to exit. Jan 28 01:23:23.545000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.20:22-10.200.16.10:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:23.552973 systemd-logind[2536]: Removed session 25. Jan 28 01:23:25.743136 kubelet[4038]: E0128 01:23:25.743097 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-bg2b5" podUID="4b861133-0274-4274-bab9-748410e42edc" Jan 28 01:23:28.661016 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:23:28.661146 kernel: audit: type=1130 audit(1769563408.652:905): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.20:22-10.200.16.10:53492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:28.652000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.20:22-10.200.16.10:53492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:28.653595 systemd[1]: Started sshd@22-10.200.8.20:22-10.200.16.10:53492.service - OpenSSH per-connection server daemon (10.200.16.10:53492). 
Jan 28 01:23:28.744467 kubelet[4038]: E0128 01:23:28.744426 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-db6789d8-85rm2" podUID="489aa6ff-974c-4c0f-ad71-b359b70146bf" Jan 28 01:23:28.747862 kubelet[4038]: E0128 01:23:28.745628 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-kz629" podUID="d68b64e3-e019-4732-8971-c8457279d8f6" Jan 28 01:23:29.226000 audit[6503]: USER_ACCT pid=6503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.234006 systemd-logind[2536]: New session 26 of user core. Jan 28 01:23:29.229120 sshd-session[6503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:29.234667 sshd[6503]: Accepted publickey for core from 10.200.16.10 port 53492 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:29.235690 kernel: audit: type=1101 audit(1769563409.226:906): pid=6503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.226000 audit[6503]: CRED_ACQ pid=6503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.241138 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 28 01:23:29.242876 kernel: audit: type=1103 audit(1769563409.226:907): pid=6503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.242926 kernel: audit: type=1006 audit(1769563409.226:908): pid=6503 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 28 01:23:29.226000 audit[6503]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe25e2290 a2=3 a3=0 items=0 ppid=1 pid=6503 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:29.247666 kernel: audit: type=1300 audit(1769563409.226:908): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe25e2290 a2=3 a3=0 items=0 ppid=1 pid=6503 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:29.226000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:29.250675 kernel: audit: type=1327 audit(1769563409.226:908): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:29.243000 audit[6503]: USER_START pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.255031 kernel: audit: type=1105 audit(1769563409.243:909): pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.246000 audit[6508]: CRED_ACQ pid=6508 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.259404 kernel: audit: type=1103 audit(1769563409.246:910): pid=6508 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.574058 sshd[6508]: Connection closed by 10.200.16.10 port 53492 Jan 28 01:23:29.575420 sshd-session[6503]: pam_unix(sshd:session): session closed for user core Jan 28 01:23:29.575000 audit[6503]: USER_END pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.582592 systemd[1]: sshd@22-10.200.8.20:22-10.200.16.10:53492.service: Deactivated successfully. 
Jan 28 01:23:29.587356 kernel: audit: type=1106 audit(1769563409.575:911): pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.587417 kernel: audit: type=1104 audit(1769563409.575:912): pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.575000 audit[6503]: CRED_DISP pid=6503 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:29.584975 systemd[1]: session-26.scope: Deactivated successfully. Jan 28 01:23:29.588405 systemd-logind[2536]: Session 26 logged out. Waiting for processes to exit. Jan 28 01:23:29.581000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.20:22-10.200.16.10:53492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:29.589395 systemd-logind[2536]: Removed session 26. Jan 28 01:23:30.742499 kubelet[4038]: E0128 01:23:30.742447 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lcd4c" podUID="74fe3431-17ca-4902-9eb5-64c3701d8bd6" Jan 28 01:23:34.692452 systemd[1]: Started sshd@23-10.200.8.20:22-10.200.16.10:58466.service - OpenSSH per-connection server daemon (10.200.16.10:58466). Jan 28 01:23:34.700081 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 01:23:34.700113 kernel: audit: type=1130 audit(1769563414.691:914): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.20:22-10.200.16.10:58466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 01:23:34.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.20:22-10.200.16.10:58466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 01:23:35.244000 audit[6520]: USER_ACCT pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:35.249932 sshd[6520]: Accepted publickey for core from 10.200.16.10 port 58466 ssh2: RSA SHA256:VtHR68MlTOJLpw2VkW0PUocZAiO696J+LJJgc3ffFuY Jan 28 01:23:35.250231 kernel: audit: type=1101 audit(1769563415.244:915): pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:35.251915 sshd-session[6520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 01:23:35.250000 audit[6520]: CRED_ACQ pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:35.257021 kernel: audit: type=1103 audit(1769563415.250:916): pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Jan 28 01:23:35.263001 kernel: audit: type=1006 audit(1769563415.250:917): pid=6520 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 28 01:23:35.250000 audit[6520]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd47d3fa50 a2=3 a3=0 items=0 ppid=1 pid=6520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:35.264827 systemd-logind[2536]: New session 27 of user core. Jan 28 01:23:35.270611 kernel: audit: type=1300 audit(1769563415.250:917): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd47d3fa50 a2=3 a3=0 items=0 ppid=1 pid=6520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 01:23:35.250000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:35.274967 kernel: audit: type=1327 audit(1769563415.250:917): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 01:23:35.276394 systemd[1]: Started session-27.scope - Session 27 of User core. 
Jan 28 01:23:35.278000 audit[6520]: USER_START pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.286973 kernel: audit: type=1105 audit(1769563415.278:918): pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.284000 audit[6524]: CRED_ACQ pid=6524 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.293971 kernel: audit: type=1103 audit(1769563415.284:919): pid=6524 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.610098 sshd[6524]: Connection closed by 10.200.16.10 port 58466
Jan 28 01:23:35.610888 sshd-session[6520]: pam_unix(sshd:session): session closed for user core
Jan 28 01:23:35.611000 audit[6520]: USER_END pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.617132 systemd[1]: sshd@23-10.200.8.20:22-10.200.16.10:58466.service: Deactivated successfully.
Jan 28 01:23:35.611000 audit[6520]: CRED_DISP pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.619820 kernel: audit: type=1106 audit(1769563415.611:920): pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.619887 kernel: audit: type=1104 audit(1769563415.611:921): pid=6520 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success'
Jan 28 01:23:35.616000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.20:22-10.200.16.10:58466 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 28 01:23:35.621680 systemd[1]: session-27.scope: Deactivated successfully.
Jan 28 01:23:35.622851 systemd-logind[2536]: Session 27 logged out. Waiting for processes to exit.
Jan 28 01:23:35.623923 systemd-logind[2536]: Removed session 27.
Jan 28 01:23:35.741567 kubelet[4038]: E0128 01:23:35.741415 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5496f89df7-4vb68" podUID="8b2b158a-081b-4454-a96f-65445d9cadc6"
Jan 28 01:23:36.742101 kubelet[4038]: E0128 01:23:36.742057 4038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6f6448978d-8pkwl" podUID="71a93f75-99db-41f8-a193-bdcc3af98dc1"