Jan 27 05:54:47.031024 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 27 03:09:34 -00 2026
Jan 27 05:54:47.031068 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=94a0aed2c135ea3629cf7bc829842658bafc4ce682f9974c582239b9a4f2cb9e
Jan 27 05:54:47.031092 kernel: BIOS-provided physical RAM map:
Jan 27 05:54:47.031107 kernel: BIOS-e820: [mem 0x0000000000000000-0x0000000000000fff] reserved
Jan 27 05:54:47.031121 kernel: BIOS-e820: [mem 0x0000000000001000-0x0000000000054fff] usable
Jan 27 05:54:47.031134 kernel: BIOS-e820: [mem 0x0000000000055000-0x000000000005ffff] reserved
Jan 27 05:54:47.031150 kernel: BIOS-e820: [mem 0x0000000000060000-0x0000000000097fff] usable
Jan 27 05:54:47.031166 kernel: BIOS-e820: [mem 0x0000000000098000-0x000000000009ffff] reserved
Jan 27 05:54:47.031181 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bd2e4fff] usable
Jan 27 05:54:47.031200 kernel: BIOS-e820: [mem 0x00000000bd2e5000-0x00000000bd2eefff] ACPI data
Jan 27 05:54:47.031215 kernel: BIOS-e820: [mem 0x00000000bd2ef000-0x00000000bf8ecfff] usable
Jan 27 05:54:47.031239 kernel: BIOS-e820: [mem 0x00000000bf8ed000-0x00000000bfb6cfff] reserved
Jan 27 05:54:47.031254 kernel: BIOS-e820: [mem 0x00000000bfb6d000-0x00000000bfb7efff] ACPI data
Jan 27 05:54:47.031270 kernel: BIOS-e820: [mem 0x00000000bfb7f000-0x00000000bfbfefff] ACPI NVS
Jan 27 05:54:47.031291 kernel: BIOS-e820: [mem 0x00000000bfbff000-0x00000000bffdffff] usable
Jan 27 05:54:47.031309 kernel: BIOS-e820: [mem 0x00000000bffe0000-0x00000000bfffffff] reserved
Jan 27 05:54:47.031326 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000021fffffff] usable
Jan 27 05:54:47.031343 kernel: NX (Execute Disable) protection: active
Jan 27 05:54:47.031359 kernel: APIC: Static calls initialized
Jan 27 05:54:47.031376 kernel: efi: EFI v2.7 by EDK II
Jan 27 05:54:47.031412 kernel: efi: TPMFinalLog=0xbfbf7000 ACPI=0xbfb7e000 ACPI 2.0=0xbfb7e014 SMBIOS=0xbf9ca000 MEMATTR=0xbd2ef018 RNG=0xbfb73018 TPMEventLog=0xbd2e5018
Jan 27 05:54:47.031429 kernel: random: crng init done
Jan 27 05:54:47.031446 kernel: secureboot: Secure boot disabled
Jan 27 05:54:47.031463 kernel: SMBIOS 2.4 present.
Jan 27 05:54:47.031484 kernel: DMI: Google Google Compute Engine/Google Compute Engine, BIOS Google 10/25/2025
Jan 27 05:54:47.031500 kernel: DMI: Memory slots populated: 1/1
Jan 27 05:54:47.031517 kernel: Hypervisor detected: KVM
Jan 27 05:54:47.031533 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Jan 27 05:54:47.031549 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 27 05:54:47.031564 kernel: kvm-clock: using sched offset of 11181482719 cycles
Jan 27 05:54:47.031580 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 27 05:54:47.031595 kernel: tsc: Detected 2299.998 MHz processor
Jan 27 05:54:47.031611 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 27 05:54:47.031633 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 27 05:54:47.031649 kernel: last_pfn = 0x220000 max_arch_pfn = 0x400000000
Jan 27 05:54:47.031665 kernel: MTRR map: 3 entries (2 fixed + 1 variable; max 18), built from 8 variable MTRRs
Jan 27 05:54:47.031683 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 27 05:54:47.031699 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Jan 27 05:54:47.031716 kernel: Using GB pages for direct mapping
Jan 27 05:54:47.031733 kernel: ACPI: Early table checksum verification disabled
Jan 27 05:54:47.031759 kernel: ACPI: RSDP 0x00000000BFB7E014 000024 (v02 Google)
Jan 27 05:54:47.031777 kernel: ACPI: XSDT 0x00000000BFB7D0E8 00005C (v01 Google GOOGFACP 00000001 01000013)
Jan 27 05:54:47.031795 kernel: ACPI: FACP 0x00000000BFB78000 0000F4 (v02 Google GOOGFACP 00000001 GOOG 00000001)
Jan 27 05:54:47.031812 kernel: ACPI: DSDT 0x00000000BFB79000 001A64 (v01 Google GOOGDSDT 00000001 GOOG 00000001)
Jan 27 05:54:47.031830 kernel: ACPI: FACS 0x00000000BFBF2000 000040
Jan 27 05:54:47.031851 kernel: ACPI: SSDT 0x00000000BFB7C000 000316 (v02 GOOGLE Tpm2Tabl 00001000 INTL 20250404)
Jan 27 05:54:47.031869 kernel: ACPI: TPM2 0x00000000BFB7B000 000034 (v04 GOOGLE 00000001 GOOG 00000001)
Jan 27 05:54:47.031886 kernel: ACPI: SRAT 0x00000000BFB77000 0000C8 (v03 Google GOOGSRAT 00000001 GOOG 00000001)
Jan 27 05:54:47.031904 kernel: ACPI: APIC 0x00000000BFB76000 000076 (v05 Google GOOGAPIC 00000001 GOOG 00000001)
Jan 27 05:54:47.031922 kernel: ACPI: SSDT 0x00000000BFB75000 000980 (v01 Google GOOGSSDT 00000001 GOOG 00000001)
Jan 27 05:54:47.031939 kernel: ACPI: WAET 0x00000000BFB74000 000028 (v01 Google GOOGWAET 00000001 GOOG 00000001)
Jan 27 05:54:47.031960 kernel: ACPI: Reserving FACP table memory at [mem 0xbfb78000-0xbfb780f3]
Jan 27 05:54:47.031977 kernel: ACPI: Reserving DSDT table memory at [mem 0xbfb79000-0xbfb7aa63]
Jan 27 05:54:47.031995 kernel: ACPI: Reserving FACS table memory at [mem 0xbfbf2000-0xbfbf203f]
Jan 27 05:54:47.032013 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb7c000-0xbfb7c315]
Jan 27 05:54:47.032030 kernel: ACPI: Reserving TPM2 table memory at [mem 0xbfb7b000-0xbfb7b033]
Jan 27 05:54:47.032048 kernel: ACPI: Reserving SRAT table memory at [mem 0xbfb77000-0xbfb770c7]
Jan 27 05:54:47.032066 kernel: ACPI: Reserving APIC table memory at [mem 0xbfb76000-0xbfb76075]
Jan 27 05:54:47.032086 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb75000-0xbfb7597f]
Jan 27 05:54:47.032104 kernel: ACPI: Reserving WAET table memory at [mem 0xbfb74000-0xbfb74027]
Jan 27 05:54:47.032121 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 27 05:54:47.032138 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0xbfffffff]
Jan 27 05:54:47.032156 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x21fffffff]
Jan 27 05:54:47.032174 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0xbfffffff] -> [mem 0x00001000-0xbfffffff]
Jan 27 05:54:47.032192 kernel: NUMA: Node 0 [mem 0x00001000-0xbfffffff] + [mem 0x100000000-0x21fffffff] -> [mem 0x00001000-0x21fffffff]
Jan 27 05:54:47.032210 kernel: NODE_DATA(0) allocated [mem 0x21fff8dc0-0x21fffffff]
Jan 27 05:54:47.032238 kernel: Zone ranges:
Jan 27 05:54:47.032256 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 27 05:54:47.032273 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 27 05:54:47.032291 kernel: Normal [mem 0x0000000100000000-0x000000021fffffff]
Jan 27 05:54:47.032308 kernel: Device empty
Jan 27 05:54:47.032325 kernel: Movable zone start for each node
Jan 27 05:54:47.032343 kernel: Early memory node ranges
Jan 27 05:54:47.032364 kernel: node 0: [mem 0x0000000000001000-0x0000000000054fff]
Jan 27 05:54:47.033873 kernel: node 0: [mem 0x0000000000060000-0x0000000000097fff]
Jan 27 05:54:47.033904 kernel: node 0: [mem 0x0000000000100000-0x00000000bd2e4fff]
Jan 27 05:54:47.033924 kernel: node 0: [mem 0x00000000bd2ef000-0x00000000bf8ecfff]
Jan 27 05:54:47.033942 kernel: node 0: [mem 0x00000000bfbff000-0x00000000bffdffff]
Jan 27 05:54:47.034082 kernel: node 0: [mem 0x0000000100000000-0x000000021fffffff]
Jan 27 05:54:47.034101 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000021fffffff]
Jan 27 05:54:47.034119 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 27 05:54:47.034144 kernel: On node 0, zone DMA: 11 pages in unavailable ranges
Jan 27 05:54:47.034289 kernel: On node 0, zone DMA: 104 pages in unavailable ranges
Jan 27 05:54:47.034308 kernel: On node 0, zone DMA32: 10 pages in unavailable ranges
Jan 27 05:54:47.034327 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 27 05:54:47.034345 kernel: On node 0, zone Normal: 32 pages in unavailable ranges
Jan 27 05:54:47.034364 kernel: ACPI: PM-Timer IO Port: 0xb008
Jan 27 05:54:47.034518 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 27 05:54:47.034542 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 27 05:54:47.034560 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 27 05:54:47.034579 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 27 05:54:47.034718 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 27 05:54:47.034736 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 27 05:54:47.034755 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 27 05:54:47.034774 kernel: CPU topo: Max. logical packages: 1
Jan 27 05:54:47.034845 kernel: CPU topo: Max. logical dies: 1
Jan 27 05:54:47.034864 kernel: CPU topo: Max. dies per package: 1
Jan 27 05:54:47.034884 kernel: CPU topo: Max. threads per core: 2
Jan 27 05:54:47.034903 kernel: CPU topo: Num. cores per package: 1
Jan 27 05:54:47.034921 kernel: CPU topo: Num. threads per package: 2
Jan 27 05:54:47.034940 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 27 05:54:47.034959 kernel: [mem 0xc0000000-0xffffffff] available for PCI devices
Jan 27 05:54:47.034978 kernel: Booting paravirtualized kernel on KVM
Jan 27 05:54:47.035001 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 27 05:54:47.035021 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 27 05:54:47.035040 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 27 05:54:47.035059 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 27 05:54:47.035077 kernel: pcpu-alloc: [0] 0 1
Jan 27 05:54:47.035096 kernel: kvm-guest: PV spinlocks enabled
Jan 27 05:54:47.035115 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 27 05:54:47.035140 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=94a0aed2c135ea3629cf7bc829842658bafc4ce682f9974c582239b9a4f2cb9e
Jan 27 05:54:47.035159 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 27 05:54:47.035178 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 27 05:54:47.035198 kernel: Fallback order for Node 0: 0
Jan 27 05:54:47.035217 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1965136
Jan 27 05:54:47.035241 kernel: Policy zone: Normal
Jan 27 05:54:47.035264 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 27 05:54:47.035283 kernel: software IO TLB: area num 2.
Jan 27 05:54:47.035315 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 27 05:54:47.035339 kernel: Kernel/User page tables isolation: enabled
Jan 27 05:54:47.035359 kernel: ftrace: allocating 40128 entries in 157 pages
Jan 27 05:54:47.035391 kernel: ftrace: allocated 157 pages with 5 groups
Jan 27 05:54:47.035410 kernel: Dynamic Preempt: voluntary
Jan 27 05:54:47.035429 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 27 05:54:47.035451 kernel: rcu: RCU event tracing is enabled.
Jan 27 05:54:47.035471 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 27 05:54:47.035496 kernel: Trampoline variant of Tasks RCU enabled.
Jan 27 05:54:47.035516 kernel: Rude variant of Tasks RCU enabled.
Jan 27 05:54:47.035536 kernel: Tracing variant of Tasks RCU enabled.
Jan 27 05:54:47.035554 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 27 05:54:47.035578 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 27 05:54:47.035596 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 27 05:54:47.035615 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 27 05:54:47.035633 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 27 05:54:47.035652 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 27 05:54:47.035671 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 27 05:54:47.035689 kernel: Console: colour dummy device 80x25
Jan 27 05:54:47.035711 kernel: printk: legacy console [ttyS0] enabled
Jan 27 05:54:47.035729 kernel: ACPI: Core revision 20240827
Jan 27 05:54:47.035748 kernel: APIC: Switch to symmetric I/O mode setup
Jan 27 05:54:47.035767 kernel: x2apic enabled
Jan 27 05:54:47.035786 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 27 05:54:47.035805 kernel: ..TIMER: vector=0x30 apic1=0 pin1=0 apic2=-1 pin2=-1
Jan 27 05:54:47.035823 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Jan 27 05:54:47.035847 kernel: Calibrating delay loop (skipped) preset value.. 4599.99 BogoMIPS (lpj=2299998)
Jan 27 05:54:47.035867 kernel: Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 1024
Jan 27 05:54:47.035887 kernel: Last level dTLB entries: 4KB 1024, 2MB 1024, 4MB 1024, 1GB 4
Jan 27 05:54:47.035906 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 27 05:54:47.035926 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jan 27 05:54:47.035946 kernel: Spectre V2 : Mitigation: IBRS
Jan 27 05:54:47.035979 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 27 05:54:47.036003 kernel: RETBleed: Mitigation: IBRS
Jan 27 05:54:47.036022 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 27 05:54:47.036041 kernel: Spectre V2 : User space: Mitigation: STIBP via prctl
Jan 27 05:54:47.036061 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 27 05:54:47.036079 kernel: MDS: Mitigation: Clear CPU buffers
Jan 27 05:54:47.036098 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 27 05:54:47.036117 kernel: active return thunk: its_return_thunk
Jan 27 05:54:47.036139 kernel: ITS: Mitigation: Aligned branch/return thunks
Jan 27 05:54:47.036159 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 27 05:54:47.036177 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 27 05:54:47.036196 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 27 05:54:47.036215 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 27 05:54:47.036242 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 27 05:54:47.036261 kernel: Freeing SMP alternatives memory: 32K
Jan 27 05:54:47.036284 kernel: pid_max: default: 32768 minimum: 301
Jan 27 05:54:47.036303 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 27 05:54:47.036323 kernel: landlock: Up and running.
Jan 27 05:54:47.036342 kernel: SELinux: Initializing.
Jan 27 05:54:47.036361 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 27 05:54:47.036398 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 27 05:54:47.036417 kernel: smpboot: CPU0: Intel(R) Xeon(R) CPU @ 2.30GHz (family: 0x6, model: 0x3f, stepping: 0x0)
Jan 27 05:54:47.036436 kernel: Performance Events: unsupported p6 CPU model 63 no PMU driver, software events only.
Jan 27 05:54:47.036459 kernel: signal: max sigframe size: 1776
Jan 27 05:54:47.036477 kernel: rcu: Hierarchical SRCU implementation.
Jan 27 05:54:47.036496 kernel: rcu: Max phase no-delay instances is 400.
Jan 27 05:54:47.036515 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 27 05:54:47.036533 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 27 05:54:47.036550 kernel: smp: Bringing up secondary CPUs ...
Jan 27 05:54:47.036568 kernel: smpboot: x86: Booting SMP configuration:
Jan 27 05:54:47.036590 kernel: .... node #0, CPUs: #1
Jan 27 05:54:47.036610 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Jan 27 05:54:47.036630 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Jan 27 05:54:47.036689 kernel: smp: Brought up 1 node, 2 CPUs
Jan 27 05:54:47.036719 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
Jan 27 05:54:47.036739 kernel: Memory: 7580388K/7860544K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15540K init, 2496K bss, 274324K reserved, 0K cma-reserved)
Jan 27 05:54:47.036764 kernel: devtmpfs: initialized
Jan 27 05:54:47.036784 kernel: x86/mm: Memory block size: 128MB
Jan 27 05:54:47.036803 kernel: ACPI: PM: Registering ACPI NVS region [mem 0xbfb7f000-0xbfbfefff] (524288 bytes)
Jan 27 05:54:47.036835 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 27 05:54:47.036859 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 27 05:54:47.036879 kernel: pinctrl core: initialized pinctrl subsystem
Jan 27 05:54:47.036897 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 27 05:54:47.036918 kernel: audit: initializing netlink subsys (disabled)
Jan 27 05:54:47.036938 kernel: audit: type=2000 audit(1769493284.183:1): state=initialized audit_enabled=0 res=1
Jan 27 05:54:47.036957 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 27 05:54:47.036974 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 27 05:54:47.036991 kernel: cpuidle: using governor menu
Jan 27 05:54:47.037010 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 27 05:54:47.037029 kernel: dca service started, version 1.12.1
Jan 27 05:54:47.037052 kernel: PCI: Using configuration type 1 for base access
Jan 27 05:54:47.037072 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 27 05:54:47.037090 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 27 05:54:47.037109 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 27 05:54:47.037128 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 27 05:54:47.037147 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 27 05:54:47.037166 kernel: ACPI: Added _OSI(Module Device)
Jan 27 05:54:47.037188 kernel: ACPI: Added _OSI(Processor Device)
Jan 27 05:54:47.037207 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 27 05:54:47.037234 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Jan 27 05:54:47.037252 kernel: ACPI: Interpreter enabled
Jan 27 05:54:47.037271 kernel: ACPI: PM: (supports S0 S3 S5)
Jan 27 05:54:47.037290 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 27 05:54:47.037308 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 27 05:54:47.037327 kernel: PCI: Ignoring E820 reservations for host bridge windows
Jan 27 05:54:47.037350 kernel: ACPI: Enabled 16 GPEs in block 00 to 0F
Jan 27 05:54:47.037368 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 27 05:54:47.037734 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Jan 27 05:54:47.038009 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Jan 27 05:54:47.038292 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Jan 27 05:54:47.038324 kernel: PCI host bridge to bus 0000:00
Jan 27 05:54:47.038603 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 27 05:54:47.038851 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 27 05:54:47.039088 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 27 05:54:47.039330 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfefff window]
Jan 27 05:54:47.039597 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 27 05:54:47.039908 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 27 05:54:47.040189 kernel: pci 0000:00:01.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Jan 27 05:54:47.040558 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 27 05:54:47.040905 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Jan 27 05:54:47.041182 kernel: pci 0000:00:03.0: [1af4:1004] type 00 class 0x000000 conventional PCI endpoint
Jan 27 05:54:47.041502 kernel: pci 0000:00:03.0: BAR 0 [io 0xc040-0xc07f]
Jan 27 05:54:47.041771 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc0001000-0xc000107f]
Jan 27 05:54:47.042053 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 27 05:54:47.042335 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc03f]
Jan 27 05:54:47.042621 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc0000000-0xc000007f]
Jan 27 05:54:47.042888 kernel: pci 0000:00:05.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 27 05:54:47.043174 kernel: pci 0000:00:05.0: BAR 0 [io 0xc080-0xc09f]
Jan 27 05:54:47.043475 kernel: pci 0000:00:05.0: BAR 1 [mem 0xc0002000-0xc000203f]
Jan 27 05:54:47.043501 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 27 05:54:47.043521 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 27 05:54:47.043540 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 27 05:54:47.043559 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 27 05:54:47.043584 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 27 05:54:47.043603 kernel: iommu: Default domain type: Translated
Jan 27 05:54:47.043622 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 27 05:54:47.043642 kernel: efivars: Registered efivars operations
Jan 27 05:54:47.043661 kernel: PCI: Using ACPI for IRQ routing
Jan 27 05:54:47.043680 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 27 05:54:47.043700 kernel: e820: reserve RAM buffer [mem 0x00055000-0x0005ffff]
Jan 27 05:54:47.043723 kernel: e820: reserve RAM buffer [mem 0x00098000-0x0009ffff]
Jan 27 05:54:47.043741 kernel: e820: reserve RAM buffer [mem 0xbd2e5000-0xbfffffff]
Jan 27 05:54:47.043760 kernel: e820: reserve RAM buffer [mem 0xbf8ed000-0xbfffffff]
Jan 27 05:54:47.043778 kernel: e820: reserve RAM buffer [mem 0xbffe0000-0xbfffffff]
Jan 27 05:54:47.043797 kernel: vgaarb: loaded
Jan 27 05:54:47.043817 kernel: clocksource: Switched to clocksource kvm-clock
Jan 27 05:54:47.043836 kernel: VFS: Disk quotas dquot_6.6.0
Jan 27 05:54:47.043855 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 27 05:54:47.043878 kernel: pnp: PnP ACPI init
Jan 27 05:54:47.043897 kernel: pnp: PnP ACPI: found 7 devices
Jan 27 05:54:47.043916 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 27 05:54:47.043935 kernel: NET: Registered PF_INET protocol family
Jan 27 05:54:47.043954 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 27 05:54:47.043973 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 27 05:54:47.043993 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 27 05:54:47.044016 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 27 05:54:47.044035 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Jan 27 05:54:47.044055 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 27 05:54:47.044074 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 27 05:54:47.044093 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 27 05:54:47.044112 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 27 05:54:47.044131 kernel: NET: Registered PF_XDP protocol family
Jan 27 05:54:47.044401 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 27 05:54:47.044648 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 27 05:54:47.044885 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 27 05:54:47.045130 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfefff window]
Jan 27 05:54:47.045440 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 27 05:54:47.045476 kernel: PCI: CLS 0 bytes, default 64
Jan 27 05:54:47.045497 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 27 05:54:47.045516 kernel: software IO TLB: mapped [mem 0x00000000b7f7f000-0x00000000bbf7f000] (64MB)
Jan 27 05:54:47.045536 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 27 05:54:47.045556 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Jan 27 05:54:47.045577 kernel: clocksource: Switched to clocksource tsc
Jan 27 05:54:47.045597 kernel: Initialise system trusted keyrings
Jan 27 05:54:47.045621 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Jan 27 05:54:47.045642 kernel: Key type asymmetric registered
Jan 27 05:54:47.045661 kernel: Asymmetric key parser 'x509' registered
Jan 27 05:54:47.045680 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jan 27 05:54:47.045698 kernel: io scheduler mq-deadline registered
Jan 27 05:54:47.045717 kernel: io scheduler kyber registered
Jan 27 05:54:47.045736 kernel: io scheduler bfq registered
Jan 27 05:54:47.045758 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 27 05:54:47.045777 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 27 05:54:47.046185 kernel: virtio-pci 0000:00:03.0: virtio_pci: leaving for legacy driver
Jan 27 05:54:47.046212 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 10
Jan 27 05:54:47.046498 kernel: virtio-pci 0000:00:04.0: virtio_pci: leaving for legacy driver
Jan 27 05:54:47.046524 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 27 05:54:47.046782 kernel: virtio-pci 0000:00:05.0: virtio_pci: leaving for legacy driver
Jan 27 05:54:47.046811 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 27 05:54:47.046831 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 27 05:54:47.046851 kernel: 00:04: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Jan 27 05:54:47.046871 kernel: 00:05: ttyS2 at I/O 0x3e8 (irq = 6, base_baud = 115200) is a 16550A
Jan 27 05:54:47.046890 kernel: 00:06: ttyS3 at I/O 0x2e8 (irq = 7, base_baud = 115200) is a 16550A
Jan 27 05:54:47.047164 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x9009, rev-id 0)
Jan 27 05:54:47.047190 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 27 05:54:47.047214 kernel: i8042: Warning: Keylock active
Jan 27 05:54:47.047241 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 27 05:54:47.047260 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 27 05:54:47.047540 kernel: rtc_cmos 00:00: RTC can wake from S4
Jan 27 05:54:47.047786 kernel: rtc_cmos 00:00: registered as rtc0
Jan 27 05:54:47.048031 kernel: rtc_cmos 00:00: setting system clock to 2026-01-27T05:54:45 UTC (1769493285)
Jan 27 05:54:47.048284 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Jan 27 05:54:47.048308 kernel: intel_pstate: CPU model not supported
Jan 27 05:54:47.048328 kernel: pstore: Using crash dump compression: deflate
Jan 27 05:54:47.048347 kernel: pstore: Registered efi_pstore as persistent store backend
Jan 27 05:54:47.048366 kernel: NET: Registered PF_INET6 protocol family
Jan 27 05:54:47.048398 kernel: Segment Routing with IPv6
Jan 27 05:54:47.048418 kernel: In-situ OAM (IOAM) with IPv6
Jan 27 05:54:47.048440 kernel: NET: Registered PF_PACKET protocol family
Jan 27 05:54:47.048459 kernel: Key type dns_resolver registered
Jan 27 05:54:47.048478 kernel: IPI shorthand broadcast: enabled
Jan 27 05:54:47.048498 kernel: sched_clock: Marking stable (1893004612, 133264976)->(2032364552, -6094964)
Jan 27 05:54:47.048517 kernel: registered taskstats version 1
Jan 27 05:54:47.048537 kernel: Loading compiled-in X.509 certificates
Jan 27 05:54:47.048557 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 9e3db75de0fafb28d6cceb2e9f9c71b82c500cb9'
Jan 27 05:54:47.048579 kernel: Demotion targets for Node 0: null
Jan 27 05:54:47.048598 kernel: Key type .fscrypt registered
Jan 27 05:54:47.048617 kernel: Key type fscrypt-provisioning registered
Jan 27 05:54:47.048635 kernel: ima: Allocated hash algorithm: sha1
Jan 27 05:54:47.048654 kernel: ima: Can not allocate sha384 (reason: -2)
Jan 27 05:54:47.048674 kernel: ima: No architecture policies found
Jan 27 05:54:47.048694 kernel: clk: Disabling unused clocks
Jan 27 05:54:47.048716 kernel: Freeing unused kernel image (initmem) memory: 15540K
Jan 27 05:54:47.048736 kernel: Write protecting the kernel read-only data: 47104k
Jan 27 05:54:47.048872 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K
Jan 27 05:54:47.048892 kernel: Run /init as init process
Jan 27 05:54:47.048919 kernel: with arguments:
Jan 27 05:54:47.048937 kernel: /init
Jan 27 05:54:47.048954 kernel: with environment:
Jan 27 05:54:47.049088 kernel: HOME=/
Jan 27 05:54:47.049115 kernel: TERM=linux
Jan 27 05:54:47.049134 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 27 05:54:47.049154 kernel: SCSI subsystem initialized
Jan 27 05:54:47.049852 kernel: virtio_scsi virtio0: 1/0/0 default/read/poll queues
Jan 27 05:54:47.050179 kernel: scsi host0: Virtio SCSI HBA
Jan 27 05:54:47.050215 kernel: blk-mq: reduced tag depth to 10240
Jan 27 05:54:47.050557 kernel: scsi 0:0:1:0: Direct-Access Google PersistentDisk 1 PQ: 0 ANSI: 6
Jan 27 05:54:47.050866 kernel: sd 0:0:1:0: [sda] 33554432 512-byte logical blocks: (17.2 GB/16.0 GiB)
Jan 27 05:54:47.052714 kernel: sd 0:0:1:0: [sda] 4096-byte physical blocks
Jan 27 05:54:47.053036 kernel: sd 0:0:1:0: [sda] Write Protect is off
Jan 27 05:54:47.053331 kernel: sd 0:0:1:0: [sda] Mode Sense: 1f 00 00 08
Jan 27 05:54:47.053648 kernel: sd 0:0:1:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Jan 27 05:54:47.053701 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 27 05:54:47.053721 kernel: GPT:25804799 != 33554431
Jan 27 05:54:47.053742 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 27 05:54:47.053761 kernel: GPT:25804799 != 33554431
Jan 27 05:54:47.053781 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 27 05:54:47.053803 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 27 05:54:47.054105 kernel: sd 0:0:1:0: [sda] Attached SCSI disk
Jan 27 05:54:47.054136 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 27 05:54:47.054159 kernel: device-mapper: uevent: version 1.0.3
Jan 27 05:54:47.054182 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jan 27 05:54:47.054205 kernel: device-mapper: verity: sha256 using shash "sha256-generic"
Jan 27 05:54:47.054236 kernel: raid6: avx2x4 gen() 17882 MB/s
Jan 27 05:54:47.054265 kernel: raid6: avx2x2 gen() 17767 MB/s
Jan 27 05:54:47.054288 kernel: raid6: avx2x1 gen() 13688 MB/s
Jan 27 05:54:47.054309 kernel: raid6: using algorithm avx2x4 gen() 17882 MB/s
Jan 27 05:54:47.054332 kernel: raid6: .... xor() 7560 MB/s, rmw enabled
Jan 27 05:54:47.054355 kernel: raid6: using avx2x2 recovery algorithm
Jan 27 05:54:47.054377 kernel: xor: automatically using best checksumming function avx
Jan 27 05:54:47.054419 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 27 05:54:47.054442 kernel: BTRFS: device fsid 8e29e710-4356-4007-b707-6ae7cc95ead5 devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (155)
Jan 27 05:54:47.054470 kernel: BTRFS info (device dm-0): first mount of filesystem 8e29e710-4356-4007-b707-6ae7cc95ead5
Jan 27 05:54:47.054493 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jan 27 05:54:47.054516 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Jan 27 05:54:47.054538 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 27 05:54:47.054560 kernel: BTRFS info (device dm-0): enabling free space tree
Jan 27 05:54:47.054583 kernel: loop: module loaded
Jan 27 05:54:47.054605 kernel: loop0: detected capacity change from 0 to 100552
Jan 27 05:54:47.054632 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 27 05:54:47.054657 systemd[1]: Successfully made /usr/ read-only.
Jan 27 05:54:47.054685 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 27 05:54:47.054709 systemd[1]: Detected virtualization google.
Jan 27 05:54:47.054732 systemd[1]: Detected architecture x86-64.
Jan 27 05:54:47.054758 systemd[1]: Running in initrd.
Jan 27 05:54:47.054781 systemd[1]: No hostname configured, using default hostname.
Jan 27 05:54:47.054804 systemd[1]: Hostname set to .
Jan 27 05:54:47.054827 systemd[1]: Initializing machine ID from random generator.
Jan 27 05:54:47.054851 systemd[1]: Queued start job for default target initrd.target.
Jan 27 05:54:47.054874 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Jan 27 05:54:47.054897 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 27 05:54:47.054924 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 27 05:54:47.054950 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 27 05:54:47.054972 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 27 05:54:47.054997 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 27 05:54:47.055021 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 27 05:54:47.055049 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 27 05:54:47.055072 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 27 05:54:47.055095 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jan 27 05:54:47.055120 systemd[1]: Reached target paths.target - Path Units.
Jan 27 05:54:47.055148 systemd[1]: Reached target slices.target - Slice Units.
Jan 27 05:54:47.055175 systemd[1]: Reached target swap.target - Swaps.
Jan 27 05:54:47.055198 systemd[1]: Reached target timers.target - Timer Units.
Jan 27 05:54:47.055229 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 27 05:54:47.055253 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 27 05:54:47.055277 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 27 05:54:47.055305 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 27 05:54:47.055333 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Jan 27 05:54:47.055357 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 27 05:54:47.055446 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 27 05:54:47.055603 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 27 05:54:47.055629 systemd[1]: Reached target sockets.target - Socket Units.
Jan 27 05:54:47.055653 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 27 05:54:47.055813 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 27 05:54:47.055843 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 27 05:54:47.055863 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 27 05:54:47.055883 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Jan 27 05:54:47.056031 systemd[1]: Starting systemd-fsck-usr.service...
Jan 27 05:54:47.056050 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 27 05:54:47.056070 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 27 05:54:47.056237 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 27 05:54:47.056260 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 27 05:54:47.056282 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 27 05:54:47.056303 systemd[1]: Finished systemd-fsck-usr.service.
Jan 27 05:54:47.056448 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 27 05:54:47.056602 systemd-journald[292]: Collecting audit messages is enabled.
Jan 27 05:54:47.056656 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 27 05:54:47.056683 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 27 05:54:47.056705 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 27 05:54:47.056726 kernel: Bridge firewalling registered
Jan 27 05:54:47.056747 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 27 05:54:47.056769 kernel: audit: type=1130 audit(1769493287.044:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.056788 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 27 05:54:47.056809 systemd-journald[292]: Journal started
Jan 27 05:54:47.056850 systemd-journald[292]: Runtime Journal (/run/log/journal/b24408b6df2a4bb6b96b68d13f57b339) is 8M, max 148.4M, 140.4M free.
Jan 27 05:54:47.044000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.040927 systemd-modules-load[293]: Inserted module 'br_netfilter'
Jan 27 05:54:47.061986 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 27 05:54:47.063000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.067415 kernel: audit: type=1130 audit(1769493287.063:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.069133 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 27 05:54:47.078518 kernel: audit: type=1130 audit(1769493287.071:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.075872 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 27 05:54:47.087526 kernel: audit: type=1130 audit(1769493287.080:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.087984 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 27 05:54:47.092750 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 27 05:54:47.098630 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 27 05:54:47.102000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.108424 kernel: audit: type=1130 audit(1769493287.102:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.110343 kernel: audit: type=1334 audit(1769493287.108:7): prog-id=6 op=LOAD
Jan 27 05:54:47.108000 audit: BPF prog-id=6 op=LOAD
Jan 27 05:54:47.110909 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 27 05:54:47.123133 systemd-tmpfiles[314]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jan 27 05:54:47.139751 kernel: audit: type=1130 audit(1769493287.134:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.135412 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 27 05:54:47.143019 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 27 05:54:47.150831 kernel: audit: type=1130 audit(1769493287.145:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.150953 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 27 05:54:47.183291 dracut-cmdline[331]: dracut-109
Jan 27 05:54:47.190545 dracut-cmdline[331]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=94a0aed2c135ea3629cf7bc829842658bafc4ce682f9974c582239b9a4f2cb9e
Jan 27 05:54:47.218463 systemd-resolved[318]: Positive Trust Anchors:
Jan 27 05:54:47.218483 systemd-resolved[318]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 27 05:54:47.218492 systemd-resolved[318]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 27 05:54:47.218565 systemd-resolved[318]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 27 05:54:47.262038 systemd-resolved[318]: Defaulting to hostname 'linux'.
Jan 27 05:54:47.264255 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 27 05:54:47.272548 kernel: audit: type=1130 audit(1769493287.264:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.264000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.265904 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 27 05:54:47.344426 kernel: Loading iSCSI transport class v2.0-870.
Jan 27 05:54:47.364420 kernel: iscsi: registered transport (tcp)
Jan 27 05:54:47.395427 kernel: iscsi: registered transport (qla4xxx)
Jan 27 05:54:47.395514 kernel: QLogic iSCSI HBA Driver
Jan 27 05:54:47.428602 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 27 05:54:47.456953 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 27 05:54:47.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.459124 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 27 05:54:47.523699 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 27 05:54:47.522000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.526603 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 27 05:54:47.536558 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 27 05:54:47.576909 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 27 05:54:47.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.582000 audit: BPF prog-id=7 op=LOAD
Jan 27 05:54:47.583000 audit: BPF prog-id=8 op=LOAD
Jan 27 05:54:47.585512 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 27 05:54:47.635495 systemd-udevd[564]: Using default interface naming scheme 'v257'.
Jan 27 05:54:47.658501 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 27 05:54:47.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.663948 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 27 05:54:47.701012 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 27 05:54:47.703000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.705000 audit: BPF prog-id=9 op=LOAD
Jan 27 05:54:47.707630 dracut-pre-trigger[642]: rd.md=0: removing MD RAID activation
Jan 27 05:54:47.709781 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 27 05:54:47.753688 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 27 05:54:47.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.761648 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 27 05:54:47.789032 systemd-networkd[672]: lo: Link UP
Jan 27 05:54:47.789591 systemd-networkd[672]: lo: Gained carrier
Jan 27 05:54:47.792145 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 27 05:54:47.799637 systemd[1]: Reached target network.target - Network.
Jan 27 05:54:47.798000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.875786 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 27 05:54:47.894000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:47.898724 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 27 05:54:48.119926 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - PersistentDisk EFI-SYSTEM.
Jan 27 05:54:48.158645 kernel: cryptd: max_cpu_qlen set to 1000
Jan 27 05:54:48.158691 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Jan 27 05:54:48.175143 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - PersistentDisk ROOT.
Jan 27 05:54:48.198418 kernel: AES CTR mode by8 optimization enabled
Jan 27 05:54:48.226555 systemd-networkd[672]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 27 05:54:48.226569 systemd-networkd[672]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 27 05:54:48.228286 systemd-networkd[672]: eth0: Link UP
Jan 27 05:54:48.228685 systemd-networkd[672]: eth0: Gained carrier
Jan 27 05:54:48.228706 systemd-networkd[672]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 27 05:54:48.240465 systemd-networkd[672]: eth0: Overlong DHCP hostname received, shortened from 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf.c.flatcar-212911.internal' to 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf'
Jan 27 05:54:48.240484 systemd-networkd[672]: eth0: DHCPv4 address 10.128.0.23/32, gateway 10.128.0.1 acquired from 169.254.169.254
Jan 27 05:54:48.268433 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM.
Jan 27 05:54:48.285833 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - PersistentDisk USR-A.
Jan 27 05:54:48.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:48.302864 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 27 05:54:48.331740 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 27 05:54:48.410985 disk-uuid[794]: Primary Header is updated.
Jan 27 05:54:48.410985 disk-uuid[794]: Secondary Entries is updated.
Jan 27 05:54:48.410985 disk-uuid[794]: Secondary Header is updated.
Jan 27 05:54:48.331974 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 27 05:54:48.360797 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 27 05:54:48.402768 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 27 05:54:48.511286 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 27 05:54:48.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:48.590775 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 27 05:54:48.589000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:48.592017 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 27 05:54:48.617642 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 27 05:54:48.627729 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 27 05:54:48.647244 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 27 05:54:48.700612 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 27 05:54:48.699000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.458130 disk-uuid[795]: Warning: The kernel is still using the old partition table.
Jan 27 05:54:49.458130 disk-uuid[795]: The new table will be used at the next reboot or after you
Jan 27 05:54:49.458130 disk-uuid[795]: run partprobe(8) or kpartx(8)
Jan 27 05:54:49.458130 disk-uuid[795]: The operation has completed successfully.
Jan 27 05:54:49.544513 kernel: kauditd_printk_skb: 15 callbacks suppressed
Jan 27 05:54:49.544571 kernel: audit: type=1130 audit(1769493289.476:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.544602 kernel: audit: type=1131 audit(1769493289.476:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.476000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.470198 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 27 05:54:49.470338 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 27 05:54:49.480617 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 27 05:54:49.611626 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (824)
Jan 27 05:54:49.611697 kernel: BTRFS info (device sda6): first mount of filesystem 3d9bae75-48f1-4a66-8ef3-32c49c69a6d1
Jan 27 05:54:49.611725 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 27 05:54:49.637903 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 27 05:54:49.637999 kernel: BTRFS info (device sda6): turning on async discard
Jan 27 05:54:49.638026 kernel: BTRFS info (device sda6): enabling free space tree
Jan 27 05:54:49.659419 kernel: BTRFS info (device sda6): last unmount of filesystem 3d9bae75-48f1-4a66-8ef3-32c49c69a6d1
Jan 27 05:54:49.660339 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 27 05:54:49.696563 kernel: audit: type=1130 audit(1769493289.659:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.664606 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 27 05:54:49.943653 ignition[843]: Ignition 2.24.0
Jan 27 05:54:49.945210 ignition[843]: Stage: fetch-offline
Jan 27 05:54:49.986563 kernel: audit: type=1130 audit(1769493289.957:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.957000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.948117 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 27 05:54:49.945329 ignition[843]: no configs at "/usr/lib/ignition/base.d"
Jan 27 05:54:49.958614 systemd-networkd[672]: eth0: Gained IPv6LL
Jan 27 05:54:49.945357 ignition[843]: no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Jan 27 05:54:49.980971 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 27 05:54:49.945674 ignition[843]: parsed url from cmdline: ""
Jan 27 05:54:50.026541 unknown[850]: fetched base config from "system"
Jan 27 05:54:50.074552 kernel: audit: type=1130 audit(1769493290.046:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:50.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 27 05:54:49.945679 ignition[843]: no config URL provided
Jan 27 05:54:50.026553 unknown[850]: fetched base config from "system"
Jan 27 05:54:49.945803 ignition[843]: reading system config file "/usr/lib/ignition/user.ign"
Jan 27 05:54:50.026562 unknown[850]: fetched user config from "gcp"
Jan 27 05:54:49.945818 ignition[843]: no config at "/usr/lib/ignition/user.ign"
Jan 27 05:54:50.029908 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jan 27 05:54:49.945827 ignition[843]: failed to fetch config: resource requires networking Jan 27 05:54:50.166530 kernel: audit: type=1130 audit(1769493290.136:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.050630 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 27 05:54:49.946085 ignition[843]: Ignition finished successfully Jan 27 05:54:50.126047 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 27 05:54:50.013938 ignition[850]: Ignition 2.24.0 Jan 27 05:54:50.140617 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 27 05:54:50.013946 ignition[850]: Stage: fetch Jan 27 05:54:50.014126 ignition[850]: no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:50.254556 kernel: audit: type=1130 audit(1769493290.226:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.226000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.214359 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 27 05:54:50.014138 ignition[850]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:50.229133 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 27 05:54:50.014254 ignition[850]: parsed url from cmdline: "" Jan 27 05:54:50.264709 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 27 05:54:50.014259 ignition[850]: no config URL provided Jan 27 05:54:50.283678 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 05:54:50.014271 ignition[850]: reading system config file "/usr/lib/ignition/user.ign" Jan 27 05:54:50.292749 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 05:54:50.014282 ignition[850]: no config at "/usr/lib/ignition/user.ign" Jan 27 05:54:50.318648 systemd[1]: Reached target basic.target - Basic System. Jan 27 05:54:50.014316 ignition[850]: GET http://169.254.169.254/computeMetadata/v1/instance/attributes/user-data: attempt #1 Jan 27 05:54:50.343230 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 27 05:54:50.016925 ignition[850]: GET result: OK Jan 27 05:54:50.017095 ignition[850]: parsing config with SHA512: 731372620807ce6fc0d3eb6ae37882f1ba463292f422e21f177b9ab280cd07819b88cb6ccb7213a2f3f52022cfc7dfa5df02d7c8bf378a6b72eafea70abdace0 Jan 27 05:54:50.027504 ignition[850]: fetch: fetch complete Jan 27 05:54:50.027511 ignition[850]: fetch: fetch passed Jan 27 05:54:50.027566 ignition[850]: Ignition finished successfully Jan 27 05:54:50.123427 ignition[856]: Ignition 2.24.0 Jan 27 05:54:50.123435 ignition[856]: Stage: kargs Jan 27 05:54:50.123605 ignition[856]: no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:50.123616 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:50.124496 ignition[856]: kargs: kargs passed Jan 27 05:54:50.124556 ignition[856]: Ignition finished successfully Jan 27 05:54:50.211502 ignition[862]: Ignition 2.24.0 Jan 27 05:54:50.211509 ignition[862]: Stage: disks Jan 27 05:54:50.211716 ignition[862]: no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:50.211732 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:50.212873 ignition[862]: disks: disks passed Jan 27 05:54:50.212934 ignition[862]: Ignition finished successfully Jan 27 05:54:50.413594 systemd-fsck[870]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 27 05:54:50.503445 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 27 05:54:50.543617 kernel: audit: type=1130 audit(1769493290.502:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:50.507550 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 27 05:54:50.736418 kernel: EXT4-fs (sda9): mounted filesystem a9099a9f-29a1-43d8-a05a-53a191872646 r/w with ordered data mode. Quota mode: none. Jan 27 05:54:50.739037 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 27 05:54:50.746856 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 27 05:54:50.764979 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 05:54:50.773175 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 27 05:54:50.785307 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 27 05:54:50.829541 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (878) Jan 27 05:54:50.829587 kernel: BTRFS info (device sda6): first mount of filesystem 3d9bae75-48f1-4a66-8ef3-32c49c69a6d1 Jan 27 05:54:50.829632 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 27 05:54:50.785361 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 27 05:54:50.785422 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 27 05:54:50.855062 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 27 05:54:50.855118 kernel: BTRFS info (device sda6): turning on async discard Jan 27 05:54:50.862407 kernel: BTRFS info (device sda6): enabling free space tree Jan 27 05:54:50.892451 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 27 05:54:50.899708 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 27 05:54:50.915908 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 27 05:54:51.246014 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 27 05:54:51.285596 kernel: audit: type=1130 audit(1769493291.255:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:51.255000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:51.259552 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 27 05:54:51.295977 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 27 05:54:51.328573 kernel: BTRFS info (device sda6): last unmount of filesystem 3d9bae75-48f1-4a66-8ef3-32c49c69a6d1 Jan 27 05:54:51.323200 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 27 05:54:51.366745 ignition[974]: INFO : Ignition 2.24.0 Jan 27 05:54:51.366745 ignition[974]: INFO : Stage: mount Jan 27 05:54:51.379542 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:51.379542 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:51.379542 ignition[974]: INFO : mount: mount passed Jan 27 05:54:51.379542 ignition[974]: INFO : Ignition finished successfully Jan 27 05:54:51.452495 kernel: audit: type=1130 audit(1769493291.387:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:51.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:51.424000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:51.370859 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 27 05:54:51.391463 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 27 05:54:51.428053 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 27 05:54:51.470023 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 05:54:51.528409 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (986) Jan 27 05:54:51.545789 kernel: BTRFS info (device sda6): first mount of filesystem 3d9bae75-48f1-4a66-8ef3-32c49c69a6d1 Jan 27 05:54:51.545864 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 27 05:54:51.561370 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 27 05:54:51.561452 kernel: BTRFS info (device sda6): turning on async discard Jan 27 05:54:51.561478 kernel: BTRFS info (device sda6): enabling free space tree Jan 27 05:54:51.569837 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 27 05:54:51.611527 ignition[1003]: INFO : Ignition 2.24.0 Jan 27 05:54:51.611527 ignition[1003]: INFO : Stage: files Jan 27 05:54:51.623570 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:51.623570 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:51.623570 ignition[1003]: DEBUG : files: compiled without relabeling support, skipping Jan 27 05:54:51.623570 ignition[1003]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 27 05:54:51.623570 ignition[1003]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 27 05:54:51.623570 ignition[1003]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 27 05:54:51.623570 ignition[1003]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 27 05:54:51.623570 ignition[1003]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 27 05:54:51.622450 unknown[1003]: wrote ssh authorized keys file for user: core Jan 27 05:54:51.715494 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 27 05:54:51.715494 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 27 05:54:51.750911 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 27 05:54:51.927431 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 27 05:54:51.927431 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 27 05:54:51.957545 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 27 05:54:52.313804 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 27 05:54:52.707161 ignition[1003]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 27 05:54:52.707161 ignition[1003]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 27 05:54:52.732000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.744754 ignition[1003]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 27 05:54:52.744754 ignition[1003]: INFO : files: files passed Jan 27 05:54:52.744754 ignition[1003]: INFO : Ignition finished successfully Jan 27 05:54:52.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.851000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.718340 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 27 05:54:52.735172 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 27 05:54:52.754812 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 27 05:54:52.770124 systemd[1]: ignition-quench.service: Deactivated successfully. 
Jan 27 05:54:52.946540 initrd-setup-root-after-ignition[1033]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 05:54:52.946540 initrd-setup-root-after-ignition[1033]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 27 05:54:52.770249 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 27 05:54:52.981639 initrd-setup-root-after-ignition[1037]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 05:54:52.990000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.990000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:52.842038 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 05:54:52.852953 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 27 05:54:52.877808 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 27 05:54:52.978998 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 27 05:54:52.979129 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 27 05:54:52.991740 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 27 05:54:53.013530 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 27 05:54:53.032036 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 27 05:54:53.033432 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 27 05:54:53.126029 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 27 05:54:53.135000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.138794 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 27 05:54:53.186782 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 27 05:54:53.187284 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 27 05:54:53.197836 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 05:54:53.215917 systemd[1]: Stopped target timers.target - Timer Units. Jan 27 05:54:53.233921 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 27 05:54:53.246000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.234118 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 27 05:54:53.264922 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 27 05:54:53.274866 systemd[1]: Stopped target basic.target - Basic System. Jan 27 05:54:53.290869 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 27 05:54:53.304883 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 27 05:54:53.321926 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 27 05:54:53.340905 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 27 05:54:53.357881 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 27 05:54:53.374892 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 27 05:54:53.390894 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 27 05:54:53.410910 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 27 05:54:53.426872 systemd[1]: Stopped target swap.target - Swaps. Jan 27 05:54:53.442791 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 27 05:54:53.454000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.443001 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 27 05:54:53.471903 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 27 05:54:53.480867 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 05:54:53.497812 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 27 05:54:53.498010 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 05:54:53.515813 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 27 05:54:53.551000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.516015 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 27 05:54:53.561884 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 27 05:54:53.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.562110 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 05:54:53.587000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.570900 systemd[1]: ignition-files.service: Deactivated successfully. Jan 27 05:54:53.623000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.571082 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 27 05:54:53.647000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.590301 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... 
Jan 27 05:54:53.666951 ignition[1058]: INFO : Ignition 2.24.0 Jan 27 05:54:53.666951 ignition[1058]: INFO : Stage: umount Jan 27 05:54:53.666951 ignition[1058]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 05:54:53.666951 ignition[1058]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Jan 27 05:54:53.666951 ignition[1058]: INFO : umount: umount passed Jan 27 05:54:53.666951 ignition[1058]: INFO : Ignition finished successfully Jan 27 05:54:53.675000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.699000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.755000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.613512 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 27 05:54:53.771000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.613822 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 05:54:53.627721 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 27 05:54:53.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.638502 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 27 05:54:53.638812 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 05:54:53.648843 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 27 05:54:53.649021 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 05:54:53.676798 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 27 05:54:53.677003 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 27 05:54:53.698371 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 27 05:54:53.699739 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 27 05:54:53.939000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.699855 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Jan 27 05:54:53.955000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.701183 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 27 05:54:53.973000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.701299 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 27 05:54:53.718856 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 27 05:54:53.719043 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 27 05:54:53.733802 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 27 05:54:54.031000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.733862 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 27 05:54:54.047000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.756566 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 27 05:54:54.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:54.066000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:54.067000 audit: BPF prog-id=6 op=UNLOAD Jan 27 05:54:54.069000 audit: BPF prog-id=9 op=UNLOAD Jan 27 05:54:53.756673 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 27 05:54:53.772644 systemd[1]: Stopped target network.target - Network. Jan 27 05:54:53.786498 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 27 05:54:53.786594 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 27 05:54:53.804555 systemd[1]: Stopped target paths.target - Path Units. Jan 27 05:54:54.124000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.818484 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 27 05:54:54.158000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.820466 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 05:54:54.176000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.837505 systemd[1]: Stopped target slices.target - Slice Units. Jan 27 05:54:53.854593 systemd[1]: Stopped target sockets.target - Socket Units. 
Jan 27 05:54:54.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.870610 systemd[1]: iscsid.socket: Deactivated successfully. Jan 27 05:54:53.870703 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 27 05:54:53.888613 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 27 05:54:54.254000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.888691 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 27 05:54:53.906596 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 27 05:54:54.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.906651 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 27 05:54:53.924553 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 27 05:54:54.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.924673 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 27 05:54:53.940604 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 27 05:54:53.940711 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 27 05:54:54.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:54.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.956616 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 27 05:54:54.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.956713 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 27 05:54:54.402000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.974719 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 27 05:54:54.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:54.421000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:53.998608 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 27 05:54:54.015345 systemd[1]: systemd-resolved.service: Deactivated successfully. 
Jan 27 05:54:54.015507 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 27 05:54:54.034440 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 27 05:54:54.034571 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 27 05:54:54.051116 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 27 05:54:54.051249 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 27 05:54:54.072015 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 27 05:54:54.083581 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 27 05:54:54.533518 systemd-journald[292]: Received SIGTERM from PID 1 (systemd). Jan 27 05:54:54.083641 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 27 05:54:54.094748 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 27 05:54:54.102687 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 27 05:54:54.102769 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 27 05:54:54.125870 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 27 05:54:54.125932 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 27 05:54:54.159702 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 27 05:54:54.159775 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 27 05:54:54.177719 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 05:54:54.200127 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 27 05:54:54.200291 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 05:54:54.219220 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 27 05:54:54.219332 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 27 05:54:54.223931 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 27 05:54:54.223975 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 05:54:54.239726 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 27 05:54:54.239792 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 27 05:54:54.271747 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 27 05:54:54.271856 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 27 05:54:54.296715 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 27 05:54:54.296928 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 27 05:54:54.323974 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 27 05:54:54.339480 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 27 05:54:54.339592 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 05:54:54.356604 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 27 05:54:54.356716 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 05:54:54.366721 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 27 05:54:54.366815 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 27 05:54:54.386652 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 27 05:54:54.386771 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 27 05:54:54.403987 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 27 05:54:54.404103 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 27 05:54:54.423725 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 27 05:54:54.441759 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 27 05:54:54.481011 systemd[1]: Switching root. Jan 27 05:54:54.854476 systemd-journald[292]: Journal stopped Jan 27 05:54:57.521974 kernel: kauditd_printk_skb: 45 callbacks suppressed Jan 27 05:54:57.522028 kernel: audit: type=1335 audit(1769493294.863:81): pid=292 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=kernel comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" nl-mcgrp=1 op=disconnect res=1 Jan 27 05:54:57.522059 kernel: SELinux: policy capability network_peer_controls=1 Jan 27 05:54:57.522078 kernel: SELinux: policy capability open_perms=1 Jan 27 05:54:57.522095 kernel: SELinux: policy capability extended_socket_class=1 Jan 27 05:54:57.522113 kernel: SELinux: policy capability always_check_network=0 Jan 27 05:54:57.522132 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 27 05:54:57.522157 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 27 05:54:57.522178 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 27 05:54:57.522297 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 27 05:54:57.522332 kernel: SELinux: policy capability userspace_initial_context=0 Jan 27 05:54:57.522346 kernel: audit: type=1403 audit(1769493295.142:82): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 27 05:54:57.522361 systemd[1]: Successfully loaded SELinux policy in 121.295ms. Jan 27 05:54:57.522426 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.515ms. Jan 27 05:54:57.522452 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 27 05:54:57.522473 systemd[1]: Detected virtualization google. Jan 27 05:54:57.522492 systemd[1]: Detected architecture x86-64. Jan 27 05:54:57.522517 systemd[1]: Detected first boot. Jan 27 05:54:57.522539 systemd[1]: Initializing machine ID from random generator. Jan 27 05:54:57.522559 kernel: audit: type=1334 audit(1769493295.329:83): prog-id=10 op=LOAD Jan 27 05:54:57.522578 kernel: audit: type=1334 audit(1769493295.329:84): prog-id=10 op=UNLOAD Jan 27 05:54:57.522598 kernel: audit: type=1334 audit(1769493295.329:85): prog-id=11 op=LOAD Jan 27 05:54:57.522617 kernel: audit: type=1334 audit(1769493295.329:86): prog-id=11 op=UNLOAD Jan 27 05:54:57.522642 zram_generator::config[1100]: No configuration found. Jan 27 05:54:57.522663 kernel: Guest personality initialized and is inactive Jan 27 05:54:57.522683 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 27 05:54:57.522702 kernel: Initialized host personality Jan 27 05:54:57.522722 kernel: NET: Registered PF_VSOCK protocol family Jan 27 05:54:57.522741 systemd[1]: Populated /etc with preset unit settings. 
Jan 27 05:54:57.522765 kernel: audit: type=1334 audit(1769493296.202:87): prog-id=12 op=LOAD Jan 27 05:54:57.522784 kernel: audit: type=1334 audit(1769493296.202:88): prog-id=3 op=UNLOAD Jan 27 05:54:57.522805 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 27 05:54:57.522828 kernel: audit: type=1334 audit(1769493296.203:89): prog-id=13 op=LOAD Jan 27 05:54:57.522850 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 27 05:54:57.522871 kernel: audit: type=1334 audit(1769493296.203:90): prog-id=14 op=LOAD Jan 27 05:54:57.522892 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 27 05:54:57.522925 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 27 05:54:57.522947 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 27 05:54:57.522969 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 27 05:54:57.523032 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 27 05:54:57.523068 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 27 05:54:57.523092 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 27 05:54:57.523120 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 27 05:54:57.523145 systemd[1]: Created slice user.slice - User and Session Slice. Jan 27 05:54:57.523167 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 05:54:57.523192 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 05:54:57.523217 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 27 05:54:57.523240 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 27 05:54:57.523265 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 27 05:54:57.523296 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 27 05:54:57.523321 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 27 05:54:57.523346 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 05:54:57.523371 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 27 05:54:57.523415 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 27 05:54:57.523445 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 27 05:54:57.523469 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 27 05:54:57.523494 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 27 05:54:57.523527 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 05:54:57.523552 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 27 05:54:57.523577 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 27 05:54:57.523598 systemd[1]: Reached target slices.target - Slice Units. Jan 27 05:54:57.523623 systemd[1]: Reached target swap.target - Swaps. Jan 27 05:54:57.523645 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 27 05:54:57.523668 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. 
Jan 27 05:54:57.523692 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 27 05:54:57.523714 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 27 05:54:57.523741 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 27 05:54:57.523764 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 27 05:54:57.523788 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 27 05:54:57.523813 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 27 05:54:57.523840 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 27 05:54:57.523864 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 05:54:57.523890 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 27 05:54:57.523913 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 27 05:54:57.523938 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 27 05:54:57.523960 systemd[1]: Mounting media.mount - External Media Directory... Jan 27 05:54:57.523990 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:57.524006 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 27 05:54:57.524064 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 27 05:54:57.524082 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 27 05:54:57.524098 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 27 05:54:57.524114 systemd[1]: Reached target machines.target - Containers. Jan 27 05:54:57.524466 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 27 05:54:57.524496 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 05:54:57.524520 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 27 05:54:57.524551 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 27 05:54:57.524575 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 05:54:57.524599 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 05:54:57.524624 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 05:54:57.524648 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 27 05:54:57.524673 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 05:54:57.524698 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 27 05:54:57.524726 kernel: fuse: init (API version 7.41) Jan 27 05:54:57.524750 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 27 05:54:57.524775 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 27 05:54:57.524799 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 27 05:54:57.524824 systemd[1]: Stopped systemd-fsck-usr.service. 
Jan 27 05:54:57.524849 kernel: ACPI: bus type drm_connector registered Jan 27 05:54:57.524873 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 05:54:57.524901 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 27 05:54:57.524924 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 27 05:54:57.524957 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 27 05:54:57.524984 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 27 05:54:57.525008 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 27 05:54:57.532371 systemd-journald[1189]: Collecting audit messages is enabled. Jan 27 05:54:57.532463 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 27 05:54:57.532490 systemd-journald[1189]: Journal started Jan 27 05:54:57.532537 systemd-journald[1189]: Runtime Journal (/run/log/journal/4eb6b01addc04a768a17083b14035def) is 8M, max 148.4M, 140.4M free. Jan 27 05:54:56.802000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 27 05:54:57.361000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.386000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.408000 audit: BPF prog-id=14 op=UNLOAD Jan 27 05:54:57.408000 audit: BPF prog-id=13 op=UNLOAD Jan 27 05:54:57.409000 audit: BPF prog-id=15 op=LOAD Jan 27 05:54:57.409000 audit: BPF prog-id=16 op=LOAD Jan 27 05:54:57.409000 audit: BPF prog-id=17 op=LOAD Jan 27 05:54:57.514000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 27 05:54:57.514000 audit[1189]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=4 a1=7ffdcc543280 a2=4000 a3=0 items=0 ppid=1 pid=1189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:54:57.514000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 27 05:54:56.179828 systemd[1]: Queued start job for default target multi-user.target. Jan 27 05:54:56.205470 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 27 05:54:56.207248 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 27 05:54:57.561417 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:57.571430 systemd[1]: Started systemd-journald.service - Journal Service. 
Jan 27 05:54:57.579000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.583175 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 27 05:54:57.592733 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 27 05:54:57.601732 systemd[1]: Mounted media.mount - External Media Directory. Jan 27 05:54:57.610741 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 27 05:54:57.619718 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 27 05:54:57.628704 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 27 05:54:57.638040 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 27 05:54:57.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.649169 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 05:54:57.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.660959 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 27 05:54:57.661174 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 27 05:54:57.670000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.670000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.671921 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 05:54:57.672199 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 05:54:57.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.681000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.683052 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 05:54:57.683333 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 05:54:57.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.692894 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 27 05:54:57.693166 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 05:54:57.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.703875 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 27 05:54:57.704135 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 27 05:54:57.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.711000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.712891 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 05:54:57.713153 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 05:54:57.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.722000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.723943 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 27 05:54:57.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.735009 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 05:54:57.744000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.746896 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 27 05:54:57.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.757970 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 27 05:54:57.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.769836 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Jan 27 05:54:57.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.792728 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 27 05:54:57.803645 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 27 05:54:57.814853 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 27 05:54:57.831540 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 27 05:54:57.840507 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 27 05:54:57.840696 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 05:54:57.851179 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 27 05:54:57.861706 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 05:54:57.861927 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 05:54:57.865377 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 27 05:54:57.885795 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 27 05:54:57.895699 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 27 05:54:57.903711 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 27 05:54:57.912242 systemd-journald[1189]: Time spent on flushing to /var/log/journal/4eb6b01addc04a768a17083b14035def is 66.130ms for 1085 entries. Jan 27 05:54:57.912242 systemd-journald[1189]: System Journal (/var/log/journal/4eb6b01addc04a768a17083b14035def) is 8M, max 588.1M, 580.1M free. Jan 27 05:54:58.005359 systemd-journald[1189]: Received client request to flush runtime journal. Jan 27 05:54:57.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:57.921905 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 05:54:57.924581 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 27 05:54:57.939759 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 27 05:54:57.952627 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 27 05:54:57.965908 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 27 05:54:57.979252 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 27 05:54:57.990429 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 27 05:54:58.003640 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 27 05:54:58.019420 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Jan 27 05:54:58.025428 kernel: loop1: detected capacity change from 0 to 55000 Jan 27 05:54:58.037659 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 27 05:54:58.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.049291 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 27 05:54:58.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.070036 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 27 05:54:58.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.080311 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 27 05:54:58.089000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.092000 audit: BPF prog-id=18 op=LOAD Jan 27 05:54:58.101974 kernel: loop2: detected capacity change from 0 to 111560 Jan 27 05:54:58.099000 audit: BPF prog-id=19 op=LOAD Jan 27 05:54:58.099000 audit: BPF prog-id=20 op=LOAD Jan 27 05:54:58.103183 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 27 05:54:58.113000 audit: BPF prog-id=21 op=LOAD Jan 27 05:54:58.116321 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 27 05:54:58.128762 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 27 05:54:58.143000 audit: BPF prog-id=22 op=LOAD Jan 27 05:54:58.143000 audit: BPF prog-id=23 op=LOAD Jan 27 05:54:58.143000 audit: BPF prog-id=24 op=LOAD Jan 27 05:54:58.146753 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 27 05:54:58.156000 audit: BPF prog-id=25 op=LOAD Jan 27 05:54:58.156000 audit: BPF prog-id=26 op=LOAD Jan 27 05:54:58.156000 audit: BPF prog-id=27 op=LOAD Jan 27 05:54:58.161439 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 27 05:54:58.181054 kernel: loop3: detected capacity change from 0 to 50784 Jan 27 05:54:58.246418 kernel: loop4: detected capacity change from 0 to 224512 Jan 27 05:54:58.255512 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Jan 27 05:54:58.255547 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Jan 27 05:54:58.272000 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 05:54:58.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.307993 systemd-nsresourced[1246]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 27 05:54:58.311217 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. 
Jan 27 05:54:58.322000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.323683 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 27 05:54:58.334415 kernel: loop5: detected capacity change from 0 to 55000 Jan 27 05:54:58.337000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:58.368418 kernel: loop6: detected capacity change from 0 to 111560 Jan 27 05:54:58.400442 kernel: loop7: detected capacity change from 0 to 50784 Jan 27 05:54:58.439408 kernel: loop1: detected capacity change from 0 to 224512 Jan 27 05:54:58.474337 (sd-merge)[1263]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-gce.raw'. Jan 27 05:54:58.486180 (sd-merge)[1263]: Merged extensions into '/usr'. Jan 27 05:54:58.494091 systemd-oomd[1241]: No swap; memory pressure usage will be degraded Jan 27 05:54:58.494564 systemd[1]: Reload requested from client PID 1224 ('systemd-sysext') (unit systemd-sysext.service)... Jan 27 05:54:58.494586 systemd[1]: Reloading... Jan 27 05:54:58.600943 systemd-resolved[1243]: Positive Trust Anchors: Jan 27 05:54:58.600968 systemd-resolved[1243]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 27 05:54:58.600977 systemd-resolved[1243]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 27 05:54:58.601051 systemd-resolved[1243]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 27 05:54:58.626155 systemd-resolved[1243]: Defaulting to hostname 'linux'. Jan 27 05:54:58.667044 zram_generator::config[1293]: No configuration found. Jan 27 05:54:59.082266 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 27 05:54:59.082992 systemd[1]: Reloading finished in 587 ms. Jan 27 05:54:59.107014 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 27 05:54:59.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.118061 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 27 05:54:59.126000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.128179 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. 
Jan 27 05:54:59.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.143296 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 27 05:54:59.165138 systemd[1]: Starting ensure-sysext.service... Jan 27 05:54:59.182039 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 27 05:54:59.193000 audit: BPF prog-id=28 op=LOAD Jan 27 05:54:59.193000 audit: BPF prog-id=21 op=UNLOAD Jan 27 05:54:59.194000 audit: BPF prog-id=29 op=LOAD Jan 27 05:54:59.194000 audit: BPF prog-id=22 op=UNLOAD Jan 27 05:54:59.194000 audit: BPF prog-id=30 op=LOAD Jan 27 05:54:59.194000 audit: BPF prog-id=31 op=LOAD Jan 27 05:54:59.194000 audit: BPF prog-id=23 op=UNLOAD Jan 27 05:54:59.194000 audit: BPF prog-id=24 op=UNLOAD Jan 27 05:54:59.202000 audit: BPF prog-id=32 op=LOAD Jan 27 05:54:59.202000 audit: BPF prog-id=15 op=UNLOAD Jan 27 05:54:59.202000 audit: BPF prog-id=33 op=LOAD Jan 27 05:54:59.202000 audit: BPF prog-id=34 op=LOAD Jan 27 05:54:59.202000 audit: BPF prog-id=16 op=UNLOAD Jan 27 05:54:59.202000 audit: BPF prog-id=17 op=UNLOAD Jan 27 05:54:59.204000 audit: BPF prog-id=35 op=LOAD Jan 27 05:54:59.204000 audit: BPF prog-id=18 op=UNLOAD Jan 27 05:54:59.204000 audit: BPF prog-id=36 op=LOAD Jan 27 05:54:59.204000 audit: BPF prog-id=37 op=LOAD Jan 27 05:54:59.204000 audit: BPF prog-id=19 op=UNLOAD Jan 27 05:54:59.204000 audit: BPF prog-id=20 op=UNLOAD Jan 27 05:54:59.206000 audit: BPF prog-id=38 op=LOAD Jan 27 05:54:59.206000 audit: BPF prog-id=25 op=UNLOAD Jan 27 05:54:59.206000 audit: BPF prog-id=39 op=LOAD Jan 27 05:54:59.206000 audit: BPF prog-id=40 op=LOAD Jan 27 05:54:59.206000 audit: BPF prog-id=26 op=UNLOAD Jan 27 05:54:59.206000 audit: BPF prog-id=27 op=UNLOAD Jan 27 05:54:59.218653 systemd[1]: Reload requested from client PID 1336 ('systemctl') (unit ensure-sysext.service)... Jan 27 05:54:59.218679 systemd[1]: Reloading... Jan 27 05:54:59.238926 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 27 05:54:59.239502 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 27 05:54:59.240469 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 27 05:54:59.242183 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. Jan 27 05:54:59.242305 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. Jan 27 05:54:59.252779 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot. Jan 27 05:54:59.252799 systemd-tmpfiles[1337]: Skipping /boot Jan 27 05:54:59.270881 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot. Jan 27 05:54:59.271023 systemd-tmpfiles[1337]: Skipping /boot Jan 27 05:54:59.362422 zram_generator::config[1369]: No configuration found. Jan 27 05:54:59.590279 systemd[1]: Reloading finished in 370 ms. Jan 27 05:54:59.614914 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 27 05:54:59.624000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:54:59.628000 audit: BPF prog-id=41 op=LOAD Jan 27 05:54:59.628000 audit: BPF prog-id=28 op=UNLOAD Jan 27 05:54:59.629000 audit: BPF prog-id=42 op=LOAD Jan 27 05:54:59.629000 audit: BPF prog-id=35 op=UNLOAD Jan 27 05:54:59.630000 audit: BPF prog-id=43 op=LOAD Jan 27 05:54:59.630000 audit: BPF prog-id=44 op=LOAD Jan 27 05:54:59.630000 audit: BPF prog-id=36 op=UNLOAD Jan 27 05:54:59.630000 audit: BPF prog-id=37 op=UNLOAD Jan 27 05:54:59.631000 audit: BPF prog-id=45 op=LOAD Jan 27 05:54:59.631000 audit: BPF prog-id=38 op=UNLOAD Jan 27 05:54:59.631000 audit: BPF prog-id=46 op=LOAD Jan 27 05:54:59.632000 audit: BPF prog-id=47 op=LOAD Jan 27 05:54:59.632000 audit: BPF prog-id=39 op=UNLOAD Jan 27 05:54:59.632000 audit: BPF prog-id=40 op=UNLOAD Jan 27 05:54:59.633000 audit: BPF prog-id=48 op=LOAD Jan 27 05:54:59.633000 audit: BPF prog-id=32 op=UNLOAD Jan 27 05:54:59.633000 audit: BPF prog-id=49 op=LOAD Jan 27 05:54:59.633000 audit: BPF prog-id=50 op=LOAD Jan 27 05:54:59.633000 audit: BPF prog-id=33 op=UNLOAD Jan 27 05:54:59.633000 audit: BPF prog-id=34 op=UNLOAD Jan 27 05:54:59.634000 audit: BPF prog-id=51 op=LOAD Jan 27 05:54:59.634000 audit: BPF prog-id=29 op=UNLOAD Jan 27 05:54:59.635000 audit: BPF prog-id=52 op=LOAD Jan 27 05:54:59.635000 audit: BPF prog-id=53 op=LOAD Jan 27 05:54:59.635000 audit: BPF prog-id=30 op=UNLOAD Jan 27 05:54:59.635000 audit: BPF prog-id=31 op=UNLOAD Jan 27 05:54:59.649567 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 05:54:59.658000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.670238 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 05:54:59.685704 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 27 05:54:59.698600 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 27 05:54:59.710804 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 27 05:54:59.720000 audit: BPF prog-id=8 op=UNLOAD Jan 27 05:54:59.720000 audit: BPF prog-id=7 op=UNLOAD Jan 27 05:54:59.721000 audit: BPF prog-id=54 op=LOAD Jan 27 05:54:59.721000 audit: BPF prog-id=55 op=LOAD Jan 27 05:54:59.724192 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 05:54:59.737913 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 27 05:54:59.756084 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:59.757222 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 05:54:59.761414 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 05:54:59.786937 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 05:54:59.793000 audit[1429]: SYSTEM_BOOT pid=1429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.804716 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Jan 27 05:54:59.813967 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 05:54:59.814359 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 05:54:59.814552 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 05:54:59.814842 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:59.840222 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 05:54:59.841284 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 05:54:59.849000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.851942 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 05:54:59.852251 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 05:54:59.861000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.861000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:54:59.867834 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:59.869149 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 05:54:59.869528 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 05:54:59.869886 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 05:54:59.870081 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 05:54:59.877338 kernel: kauditd_printk_skb: 121 callbacks suppressed Jan 27 05:54:59.877431 kernel: audit: type=1305 audit(1769493299.869:210): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 05:54:59.869000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 05:54:59.871221 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Jan 27 05:54:59.877637 augenrules[1439]: No rules Jan 27 05:54:59.871575 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:54:59.873658 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 27 05:54:59.869000 audit[1439]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd8fedba30 a2=420 a3=0 items=0 ppid=1411 pid=1439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:54:59.892233 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 05:54:59.894463 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 05:54:59.909708 systemd-udevd[1424]: Using default interface naming scheme 'v257'. Jan 27 05:54:59.922730 kernel: audit: type=1300 audit(1769493299.869:210): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd8fedba30 a2=420 a3=0 items=0 ppid=1411 pid=1439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:54:59.869000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:54:59.942957 kernel: audit: type=1327 audit(1769493299.869:210): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:54:59.964072 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 05:54:59.964665 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 05:54:59.976433 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 27 05:55:00.005270 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:55:00.009443 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 05:55:00.016827 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 05:55:00.020597 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 05:55:00.023510 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 05:55:00.041901 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 05:55:00.058032 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 05:55:00.076380 systemd[1]: Starting setup-oem.service - Setup OEM... Jan 27 05:55:00.083737 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 05:55:00.084156 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 05:55:00.084357 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 05:55:00.084691 systemd[1]: Reached target time-set.target - System Time Set. 
Jan 27 05:55:00.093869 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 27 05:55:00.097803 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 05:55:00.111493 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 27 05:55:00.122360 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 05:55:00.122733 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 05:55:00.134733 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 05:55:00.135699 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 05:55:00.142283 augenrules[1450]: /sbin/augenrules: No change Jan 27 05:55:00.169267 systemd[1]: Finished ensure-sysext.service. Jan 27 05:55:00.191352 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 27 05:55:00.200574 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 27 05:55:00.201459 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 05:55:00.202602 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 05:55:00.214547 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 05:55:00.214910 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 05:55:00.223000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 05:55:00.245565 kernel: audit: type=1305 audit(1769493300.223:211): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 05:55:00.257528 systemd[1]: Condition check resulted in dev-tpmrm0.device - /dev/tpmrm0 being skipped. Jan 27 05:55:00.261097 augenrules[1500]: No rules Jan 27 05:55:00.259220 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Jan 27 05:55:00.294549 kernel: audit: type=1300 audit(1769493300.223:211): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff6d7212a0 a2=420 a3=0 items=0 ppid=1450 pid=1500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:00.223000 audit[1500]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff6d7212a0 a2=420 a3=0 items=0 ppid=1450 pid=1500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:00.223000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:00.309409 kernel: audit: type=1327 audit(1769493300.223:211): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:00.258000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 05:55:00.311799 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Jan 27 05:55:00.311973 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 05:55:00.312970 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 05:55:00.313394 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 05:55:00.327149 kernel: audit: type=1305 audit(1769493300.258:212): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 05:55:00.258000 audit[1500]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff6d723730 a2=420 a3=0 items=0 ppid=1450 pid=1500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:00.360581 kernel: audit: type=1300 audit(1769493300.258:212): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff6d723730 a2=420 a3=0 items=0 ppid=1450 pid=1500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:00.360713 kernel: audit: type=1327 audit(1769493300.258:212): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:00.258000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:00.379123 systemd[1]: Finished setup-oem.service - Setup OEM. Jan 27 05:55:00.419267 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 27 05:55:00.430743 systemd[1]: Starting oem-gce-enable-oslogin.service - Enable GCE OS Login... Jan 27 05:55:00.452479 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr Jan 27 05:55:00.483450 kernel: mousedev: PS/2 mouse device common for all mice Jan 27 05:55:00.544313 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 27 05:55:00.588516 systemd[1]: Finished oem-gce-enable-oslogin.service - Enable GCE OS Login. Jan 27 05:55:00.613529 kernel: ACPI: button: Power Button [PWRF] Jan 27 05:55:00.640053 systemd-networkd[1499]: lo: Link UP Jan 27 05:55:00.641945 systemd-networkd[1499]: lo: Gained carrier Jan 27 05:55:00.645930 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 27 05:55:00.647013 systemd-networkd[1499]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 05:55:00.647465 systemd-networkd[1499]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 27 05:55:00.651809 systemd-networkd[1499]: eth0: Link UP Jan 27 05:55:00.654215 systemd-networkd[1499]: eth0: Gained carrier Jan 27 05:55:00.654252 systemd-networkd[1499]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 05:55:00.654722 systemd[1]: Reached target network.target - Network. Jan 27 05:55:00.668629 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Jan 27 05:55:00.672900 systemd-networkd[1499]: eth0: Overlong DHCP hostname received, shortened from 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf.c.flatcar-212911.internal' to 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:55:00.672925 systemd-networkd[1499]: eth0: DHCPv4 address 10.128.0.23/32, gateway 10.128.0.1 acquired from 169.254.169.254 Jan 27 05:55:00.681742 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 27 05:55:00.730324 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input4 Jan 27 05:55:00.769442 kernel: ACPI: button: Sleep Button [SLPF] Jan 27 05:55:00.811723 kernel: EDAC MC: Ver: 3.0.0 Jan 27 05:55:00.894720 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 27 05:55:01.047220 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 05:55:01.193666 ldconfig[1417]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 27 05:55:01.202345 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 27 05:55:01.206807 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 27 05:55:01.241577 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 05:55:01.261909 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 27 05:55:01.283259 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Jan 27 05:55:01.293521 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 05:55:01.302699 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 27 05:55:01.312583 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 27 05:55:01.322550 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 27 05:55:01.332742 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 27 05:55:01.341669 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 27 05:55:01.351747 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 27 05:55:01.361726 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 27 05:55:01.370516 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 27 05:55:01.380544 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 27 05:55:01.380602 systemd[1]: Reached target paths.target - Path Units. Jan 27 05:55:01.389507 systemd[1]: Reached target timers.target - Timer Units. Jan 27 05:55:01.400043 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 27 05:55:01.412278 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 27 05:55:01.421820 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 27 05:55:01.431711 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 27 05:55:01.441505 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 27 05:55:01.461249 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Jan 27 05:55:01.470999 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 27 05:55:01.483213 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 27 05:55:01.495512 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 27 05:55:01.506088 systemd[1]: Reached target sockets.target - Socket Units. Jan 27 05:55:01.515550 systemd[1]: Reached target basic.target - Basic System. Jan 27 05:55:01.515922 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 27 05:55:01.515966 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 27 05:55:01.518581 systemd[1]: Starting containerd.service - containerd container runtime... Jan 27 05:55:01.521147 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 27 05:55:01.527649 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 27 05:55:01.535059 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 27 05:55:01.558790 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 27 05:55:01.569803 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 27 05:55:01.578494 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 27 05:55:01.582609 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 27 05:55:01.598922 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 27 05:55:01.610687 systemd[1]: Started ntpd.service - Network Time Service. Jan 27 05:55:01.622562 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 27 05:55:01.631306 jq[1566]: false Jan 27 05:55:01.633597 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 27 05:55:01.646756 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 27 05:55:01.654665 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Refreshing passwd entry cache Jan 27 05:55:01.655056 oslogin_cache_refresh[1568]: Refreshing passwd entry cache Jan 27 05:55:01.664717 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 27 05:55:01.670615 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Failure getting users, quitting Jan 27 05:55:01.670615 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 27 05:55:01.670615 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Refreshing group entry cache Jan 27 05:55:01.669695 oslogin_cache_refresh[1568]: Failure getting users, quitting Jan 27 05:55:01.669736 oslogin_cache_refresh[1568]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 27 05:55:01.669797 oslogin_cache_refresh[1568]: Refreshing group entry cache Jan 27 05:55:01.673543 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionSecurity=!tpm2). Jan 27 05:55:01.675097 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 27 05:55:01.678865 systemd[1]: Starting update-engine.service - Update Engine... 
Jan 27 05:55:01.683582 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Failure getting groups, quitting Jan 27 05:55:01.683709 oslogin_cache_refresh[1568]: Failure getting groups, quitting Jan 27 05:55:01.683842 google_oslogin_nss_cache[1568]: oslogin_cache_refresh[1568]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 27 05:55:01.683905 oslogin_cache_refresh[1568]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 27 05:55:01.698655 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 27 05:55:01.705175 extend-filesystems[1567]: Found /dev/sda6 Jan 27 05:55:01.711692 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 27 05:55:01.727130 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 27 05:55:01.746337 extend-filesystems[1567]: Found /dev/sda9 Jan 27 05:55:01.744944 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 27 05:55:01.761782 extend-filesystems[1567]: Checking size of /dev/sda9 Jan 27 05:55:01.745341 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 27 05:55:01.775062 jq[1581]: true Jan 27 05:55:01.745878 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 27 05:55:01.746237 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 27 05:55:01.752722 systemd[1]: motdgen.service: Deactivated successfully. Jan 27 05:55:01.753068 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 27 05:55:01.782652 coreos-metadata[1563]: Jan 27 05:55:01.781 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/hostname: Attempt #1 Jan 27 05:55:01.784304 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 27 05:55:01.785521 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 27 05:55:01.788210 coreos-metadata[1563]: Jan 27 05:55:01.788 INFO Fetch successful Jan 27 05:55:01.788210 coreos-metadata[1563]: Jan 27 05:55:01.788 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/access-configs/0/external-ip: Attempt #1 Jan 27 05:55:01.790652 coreos-metadata[1563]: Jan 27 05:55:01.790 INFO Fetch successful Jan 27 05:55:01.790652 coreos-metadata[1563]: Jan 27 05:55:01.790 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/ip: Attempt #1 Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: ntpd 4.2.8p18@1.4062-o Tue Jan 27 02:40:04 UTC 2026 (1): Starting Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: ---------------------------------------------------- Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: ntp-4 is maintained by Network Time Foundation, Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: corporation. 
Support and training for ntp-4 are Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: available at https://www.nwtime.org/support Jan 27 05:55:01.790791 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: ---------------------------------------------------- Jan 27 05:55:01.788989 ntpd[1570]: ntpd 4.2.8p18@1.4062-o Tue Jan 27 02:40:04 UTC 2026 (1): Starting Jan 27 05:55:01.789064 ntpd[1570]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Jan 27 05:55:01.789081 ntpd[1570]: ---------------------------------------------------- Jan 27 05:55:01.789095 ntpd[1570]: ntp-4 is maintained by Network Time Foundation, Jan 27 05:55:01.789108 ntpd[1570]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Jan 27 05:55:01.789122 ntpd[1570]: corporation. Support and training for ntp-4 are Jan 27 05:55:01.789135 ntpd[1570]: available at https://www.nwtime.org/support Jan 27 05:55:01.789149 ntpd[1570]: ---------------------------------------------------- Jan 27 05:55:01.802897 ntpd[1570]: proto: precision = 0.107 usec (-23) Jan 27 05:55:01.804984 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: proto: precision = 0.107 usec (-23) Jan 27 05:55:01.805079 extend-filesystems[1567]: Resized partition /dev/sda9 Jan 27 05:55:01.811529 coreos-metadata[1563]: Jan 27 05:55:01.792 INFO Fetch successful Jan 27 05:55:01.811529 coreos-metadata[1563]: Jan 27 05:55:01.792 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/machine-type: Attempt #1 Jan 27 05:55:01.811529 coreos-metadata[1563]: Jan 27 05:55:01.805 INFO Fetch successful Jan 27 05:55:01.811672 update_engine[1580]: I20260127 05:55:01.796463 1580 main.cc:92] Flatcar Update Engine starting Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: basedate set to 2026-01-15 Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: gps base set to 2026-01-18 (week 2402) Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listen and drop on 0 v6wildcard [::]:123 Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listen normally on 2 lo 127.0.0.1:123 Jan 27 05:55:01.812094 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listen normally on 3 eth0 10.128.0.23:123 Jan 27 05:55:01.808523 ntpd[1570]: basedate set to 2026-01-15 Jan 27 05:55:01.808548 ntpd[1570]: gps base set to 2026-01-18 (week 2402) Jan 27 05:55:01.817548 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listen normally on 4 lo [::1]:123 Jan 27 05:55:01.817548 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: bind(21) AF_INET6 [fe80::4001:aff:fe80:17%2]:123 flags 0x811 failed: Cannot assign requested address Jan 27 05:55:01.817548 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: unable to create socket on eth0 (5) for [fe80::4001:aff:fe80:17%2]:123 Jan 27 05:55:01.817548 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: cannot bind address fe80::4001:aff:fe80:17%2 Jan 27 05:55:01.817548 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: Listening on routing socket on fd #21 for interface updates Jan 27 05:55:01.808737 ntpd[1570]: Listen and drop on 0 v6wildcard [::]:123 Jan 27 05:55:01.808781 ntpd[1570]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Jan 27 05:55:01.809039 ntpd[1570]: Listen normally on 2 lo 127.0.0.1:123 Jan 27 05:55:01.809081 ntpd[1570]: Listen normally on 3 eth0 10.128.0.23:123 Jan 27 05:55:01.813435 ntpd[1570]: Listen normally on 4 lo [::1]:123 Jan 27 05:55:01.813502 ntpd[1570]: bind(21) AF_INET6 [fe80::4001:aff:fe80:17%2]:123 flags 0x811 failed: Cannot assign requested address Jan 27 05:55:01.813535 
ntpd[1570]: unable to create socket on eth0 (5) for [fe80::4001:aff:fe80:17%2]:123 Jan 27 05:55:01.813556 ntpd[1570]: cannot bind address fe80::4001:aff:fe80:17%2 Jan 27 05:55:01.813597 ntpd[1570]: Listening on routing socket on fd #21 for interface updates Jan 27 05:55:01.829545 extend-filesystems[1610]: resize2fs 1.47.3 (8-Jul-2025) Jan 27 05:55:01.884100 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 2604027 blocks Jan 27 05:55:01.884151 jq[1607]: true Jan 27 05:55:01.865529 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 27 05:55:01.849073 ntpd[1570]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Jan 27 05:55:01.884889 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Jan 27 05:55:01.884889 ntpd[1570]: 27 Jan 05:55:01 ntpd[1570]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Jan 27 05:55:01.849110 ntpd[1570]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Jan 27 05:55:01.909701 kernel: EXT4-fs (sda9): resized filesystem to 2604027 Jan 27 05:55:01.941093 extend-filesystems[1610]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 27 05:55:01.941093 extend-filesystems[1610]: old_desc_blocks = 1, new_desc_blocks = 2 Jan 27 05:55:01.941093 extend-filesystems[1610]: The filesystem on /dev/sda9 is now 2604027 (4k) blocks long. Jan 27 05:55:01.979618 extend-filesystems[1567]: Resized filesystem in /dev/sda9 Jan 27 05:55:01.943650 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 27 05:55:01.987734 tar[1600]: linux-amd64/LICENSE Jan 27 05:55:01.987734 tar[1600]: linux-amd64/helm Jan 27 05:55:01.944106 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 27 05:55:02.007693 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 27 05:55:02.013236 systemd-logind[1576]: Watching system buttons on /dev/input/event2 (Power Button) Jan 27 05:55:02.013283 systemd-logind[1576]: Watching system buttons on /dev/input/event3 (Sleep Button) Jan 27 05:55:02.013316 systemd-logind[1576]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 27 05:55:02.022222 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 27 05:55:02.022365 systemd-logind[1576]: New seat seat0. Jan 27 05:55:02.033075 systemd[1]: Started systemd-logind.service - User Login Management. Jan 27 05:55:02.056375 systemd-networkd[1499]: eth0: Gained IPv6LL Jan 27 05:55:02.075078 bash[1647]: Updated "/home/core/.ssh/authorized_keys" Jan 27 05:55:02.069722 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 27 05:55:02.086242 systemd[1]: Starting sshkeys.service... Jan 27 05:55:02.103516 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 27 05:55:02.113710 systemd[1]: Reached target network-online.target - Network is Online. Jan 27 05:55:02.126157 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 05:55:02.138708 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 27 05:55:02.142946 dbus-daemon[1564]: [system] SELinux support is enabled Jan 27 05:55:02.154336 systemd[1]: Starting oem-gce.service - GCE Linux Agent... Jan 27 05:55:02.162850 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Jan 27 05:55:02.177671 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 27 05:55:02.177708 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 27 05:55:02.182742 dbus-daemon[1564]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.3' (uid=244 pid=1499 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 27 05:55:02.187824 update_engine[1580]: I20260127 05:55:02.185889 1580 update_check_scheduler.cc:74] Next update check in 2m45s Jan 27 05:55:02.187569 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 27 05:55:02.187601 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 27 05:55:02.222219 systemd[1]: Started update-engine.service - Update Engine. Jan 27 05:55:02.227616 dbus-daemon[1564]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 27 05:55:02.248624 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 27 05:55:02.284043 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 27 05:55:02.290415 init.sh[1654]: + '[' -e /etc/default/instance_configs.cfg.template ']' Jan 27 05:55:02.290415 init.sh[1654]: + echo -e '[InstanceSetup]\nset_host_keys = false' Jan 27 05:55:02.294047 init.sh[1654]: + /usr/bin/google_instance_setup Jan 27 05:55:02.299019 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 27 05:55:02.359222 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 27 05:55:02.417217 coreos-metadata[1657]: Jan 27 05:55:02.417 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/sshKeys: Attempt #1 Jan 27 05:55:02.432524 coreos-metadata[1657]: Jan 27 05:55:02.431 INFO Fetch failed with 404: resource not found Jan 27 05:55:02.432524 coreos-metadata[1657]: Jan 27 05:55:02.431 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/ssh-keys: Attempt #1 Jan 27 05:55:02.432524 coreos-metadata[1657]: Jan 27 05:55:02.432 INFO Fetch successful Jan 27 05:55:02.432524 coreos-metadata[1657]: Jan 27 05:55:02.432 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/block-project-ssh-keys: Attempt #1 Jan 27 05:55:02.433571 coreos-metadata[1657]: Jan 27 05:55:02.433 INFO Fetch failed with 404: resource not found Jan 27 05:55:02.433571 coreos-metadata[1657]: Jan 27 05:55:02.433 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/sshKeys: Attempt #1 Jan 27 05:55:02.434271 coreos-metadata[1657]: Jan 27 05:55:02.433 INFO Fetch failed with 404: resource not found Jan 27 05:55:02.434271 coreos-metadata[1657]: Jan 27 05:55:02.433 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/ssh-keys: Attempt #1 Jan 27 05:55:02.470507 coreos-metadata[1657]: Jan 27 05:55:02.463 INFO Fetch successful Jan 27 05:55:02.476295 unknown[1657]: wrote ssh authorized keys file for user: core Jan 27 05:55:02.484545 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Jan 27 05:55:02.621331 update-ssh-keys[1676]: Updated "/home/core/.ssh/authorized_keys" Jan 27 05:55:02.621945 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 27 05:55:02.638204 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 27 05:55:02.650008 systemd[1]: Finished sshkeys.service. Jan 27 05:55:02.662058 sshd_keygen[1603]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 27 05:55:02.683173 dbus-daemon[1564]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 27 05:55:02.713451 dbus-daemon[1564]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1660 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 27 05:55:02.730835 systemd[1]: Starting polkit.service - Authorization Manager... Jan 27 05:55:02.789407 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 27 05:55:02.808196 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 27 05:55:02.825572 systemd[1]: Started sshd@0-10.128.0.23:22-4.153.228.146:39494.service - OpenSSH per-connection server daemon (4.153.228.146:39494). Jan 27 05:55:02.898600 systemd[1]: issuegen.service: Deactivated successfully. Jan 27 05:55:02.899081 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 27 05:55:02.911894 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 27 05:55:02.955565 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 27 05:55:02.957804 locksmithd[1662]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 27 05:55:02.969029 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 27 05:55:02.984961 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 27 05:55:02.994137 systemd[1]: Reached target getty.target - Login Prompts. Jan 27 05:55:03.020637 polkitd[1685]: Started polkitd version 126 Jan 27 05:55:03.045336 polkitd[1685]: Loading rules from directory /etc/polkit-1/rules.d Jan 27 05:55:03.046033 polkitd[1685]: Loading rules from directory /run/polkit-1/rules.d Jan 27 05:55:03.046098 polkitd[1685]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Jan 27 05:55:03.046836 polkitd[1685]: Loading rules from directory /usr/local/share/polkit-1/rules.d Jan 27 05:55:03.046897 polkitd[1685]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Jan 27 05:55:03.047017 polkitd[1685]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 27 05:55:03.051581 polkitd[1685]: Finished loading, compiling and executing 2 rules Jan 27 05:55:03.053672 systemd[1]: Started polkit.service - Authorization Manager. 
Jan 27 05:55:03.056455 dbus-daemon[1564]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 27 05:55:03.063375 polkitd[1685]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 27 05:55:03.097160 containerd[1613]: time="2026-01-27T05:55:03Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 27 05:55:03.099839 containerd[1613]: time="2026-01-27T05:55:03.099789867Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 27 05:55:03.108243 systemd-hostnamed[1660]: Hostname set to (transient) Jan 27 05:55:03.109332 systemd-resolved[1243]: System hostname changed to 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf'. Jan 27 05:55:03.132220 containerd[1613]: time="2026-01-27T05:55:03.132163992Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.969µs" Jan 27 05:55:03.132220 containerd[1613]: time="2026-01-27T05:55:03.132215426Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 27 05:55:03.132355 containerd[1613]: time="2026-01-27T05:55:03.132269144Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 27 05:55:03.132355 containerd[1613]: time="2026-01-27T05:55:03.132289755Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132516679Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132565418Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132651809Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132670199Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132967111Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.132990684Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.133010156Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.133024398Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.133244744Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" 
id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.133299 containerd[1613]: time="2026-01-27T05:55:03.133265432Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.135059179Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.135423985Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.135477887Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.135496318Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.135553093Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.136025524Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 27 05:55:03.136857 containerd[1613]: time="2026-01-27T05:55:03.136121842Z" level=info msg="metadata content store policy set" policy=shared Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.141829168Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.141903849Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142144601Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142170585Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142193190Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142212679Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142232430Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142249459Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142268778Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142290298Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: 
time="2026-01-27T05:55:03.142308287Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142326094Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142341265Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 27 05:55:03.142981 containerd[1613]: time="2026-01-27T05:55:03.142360743Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143103323Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143141511Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143168159Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143186502Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143206688Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143225100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143263769Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143283648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143303924Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143322012Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143340708Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143377340Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143486435Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 27 05:55:03.143619 containerd[1613]: time="2026-01-27T05:55:03.143508483Z" level=info msg="Start snapshots syncer" Jan 27 05:55:03.145500 containerd[1613]: time="2026-01-27T05:55:03.144421042Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 27 05:55:03.145500 containerd[1613]: time="2026-01-27T05:55:03.144833115Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.144922787Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.144979549Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145119331Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145339491Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145366722Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145441668Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145463643Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145490704Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145508929Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 27 05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145527058Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 27 
05:55:03.145764 containerd[1613]: time="2026-01-27T05:55:03.145544667Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146437188Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146569262Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146590416Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146608807Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146623529Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146640863Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146659297Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146677996Z" level=info msg="runtime interface created" Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146687054Z" level=info msg="created NRI interface" Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146700871Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146719311Z" level=info msg="Connect containerd service" Jan 27 05:55:03.147082 containerd[1613]: time="2026-01-27T05:55:03.146781513Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 27 05:55:03.149257 containerd[1613]: time="2026-01-27T05:55:03.149176042Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388479572Z" level=info msg="Start subscribing containerd event" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388549624Z" level=info msg="Start recovering state" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388676441Z" level=info msg="Start event monitor" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388694413Z" level=info msg="Start cni network conf syncer for default" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388707091Z" level=info msg="Start streaming server" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388728013Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388742531Z" level=info msg="runtime interface starting up..." 
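The single error above ("no network config found in /etc/cni/net.d") is the expected state for a node that has not yet joined a cluster; the cni network conf syncer started just above keeps watching that directory until a config appears. Purely as a sketch of what it is waiting for, assuming the standard bridge/host-local/portmap CNI plugins exist under /opt/cni/bin (the binDirs value in the config dump above); the network name, bridge device, and subnet are illustrative, not this node's eventual pod network:

# Hypothetical example: a minimal bridge network that would satisfy the CRI plugin.
sudo tee /etc/cni/net.d/10-bridge.conflist <<'EOF'
{
  "cniVersion": "1.0.0",
  "name": "bridge-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[{ "subnet": "10.85.0.0/16" }]],
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}
EOF
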
Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388752911Z" level=info msg="starting plugins..." Jan 27 05:55:03.390523 containerd[1613]: time="2026-01-27T05:55:03.388771760Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 27 05:55:03.394797 containerd[1613]: time="2026-01-27T05:55:03.391448170Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 27 05:55:03.394797 containerd[1613]: time="2026-01-27T05:55:03.391665715Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 27 05:55:03.396652 systemd[1]: Started containerd.service - containerd container runtime. Jan 27 05:55:03.397967 containerd[1613]: time="2026-01-27T05:55:03.396829787Z" level=info msg="containerd successfully booted in 0.303910s" Jan 27 05:55:03.421297 sshd[1692]: Accepted publickey for core from 4.153.228.146 port 39494 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:03.427740 sshd-session[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:03.448311 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 27 05:55:03.461776 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 27 05:55:03.495991 systemd-logind[1576]: New session 1 of user core. Jan 27 05:55:03.512270 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 27 05:55:03.531542 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 27 05:55:03.582454 (systemd)[1730]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:03.592692 systemd-logind[1576]: New session 2 of user core. Jan 27 05:55:03.914972 systemd[1730]: Queued start job for default target default.target. Jan 27 05:55:03.924004 systemd[1730]: Created slice app.slice - User Application Slice. Jan 27 05:55:03.924057 systemd[1730]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 27 05:55:03.924083 systemd[1730]: Reached target paths.target - Paths. Jan 27 05:55:03.924159 systemd[1730]: Reached target timers.target - Timers. Jan 27 05:55:03.926912 systemd[1730]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 27 05:55:03.930822 systemd[1730]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 27 05:55:03.973316 systemd[1730]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 27 05:55:03.979214 systemd[1730]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 27 05:55:03.979433 systemd[1730]: Reached target sockets.target - Sockets. Jan 27 05:55:03.980316 systemd[1730]: Reached target basic.target - Basic System. Jan 27 05:55:03.981010 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 27 05:55:03.981077 systemd[1730]: Reached target default.target - Main User Target. Jan 27 05:55:03.981146 systemd[1730]: Startup finished in 370ms. Jan 27 05:55:03.989458 instance-setup[1659]: INFO Running google_set_multiqueue. Jan 27 05:55:04.000649 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 27 05:55:04.030897 instance-setup[1659]: INFO Set channels for eth0 to 2. Jan 27 05:55:04.040964 tar[1600]: linux-amd64/README.md Jan 27 05:55:04.044175 instance-setup[1659]: INFO Setting /proc/irq/31/smp_affinity_list to 0 for device virtio1. 
Jan 27 05:55:04.047137 instance-setup[1659]: INFO /proc/irq/31/smp_affinity_list: real affinity 0 Jan 27 05:55:04.047223 instance-setup[1659]: INFO Setting /proc/irq/32/smp_affinity_list to 0 for device virtio1. Jan 27 05:55:04.048658 instance-setup[1659]: INFO /proc/irq/32/smp_affinity_list: real affinity 0 Jan 27 05:55:04.049181 instance-setup[1659]: INFO Setting /proc/irq/33/smp_affinity_list to 1 for device virtio1. Jan 27 05:55:04.051334 instance-setup[1659]: INFO /proc/irq/33/smp_affinity_list: real affinity 1 Jan 27 05:55:04.051894 instance-setup[1659]: INFO Setting /proc/irq/34/smp_affinity_list to 1 for device virtio1. Jan 27 05:55:04.058544 instance-setup[1659]: INFO /proc/irq/34/smp_affinity_list: real affinity 1 Jan 27 05:55:04.080642 instance-setup[1659]: INFO /usr/bin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Jan 27 05:55:04.084519 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 27 05:55:04.090816 instance-setup[1659]: INFO /usr/bin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Jan 27 05:55:04.093333 instance-setup[1659]: INFO Queue 0 XPS=1 for /sys/class/net/eth0/queues/tx-0/xps_cpus Jan 27 05:55:04.094163 instance-setup[1659]: INFO Queue 1 XPS=2 for /sys/class/net/eth0/queues/tx-1/xps_cpus Jan 27 05:55:04.130346 init.sh[1654]: + /usr/bin/google_metadata_script_runner --script-type startup Jan 27 05:55:04.136298 systemd[1]: Started sshd@1-10.128.0.23:22-4.153.228.146:39604.service - OpenSSH per-connection server daemon (4.153.228.146:39604). Jan 27 05:55:04.362803 startup-script[1775]: INFO Starting startup scripts. Jan 27 05:55:04.372672 startup-script[1775]: INFO No startup scripts found in metadata. Jan 27 05:55:04.372754 startup-script[1775]: INFO Finished running startup scripts. Jan 27 05:55:04.397417 init.sh[1654]: + trap 'stopping=1 ; kill "${daemon_pids[@]}" || :' SIGTERM Jan 27 05:55:04.397417 init.sh[1654]: + daemon_pids=() Jan 27 05:55:04.397417 init.sh[1654]: + for d in accounts clock_skew network Jan 27 05:55:04.397417 init.sh[1654]: + daemon_pids+=($!) Jan 27 05:55:04.397417 init.sh[1654]: + for d in accounts clock_skew network Jan 27 05:55:04.397754 init.sh[1782]: + /usr/bin/google_clock_skew_daemon Jan 27 05:55:04.400091 init.sh[1781]: + /usr/bin/google_accounts_daemon Jan 27 05:55:04.401197 init.sh[1654]: + daemon_pids+=($!) Jan 27 05:55:04.401197 init.sh[1654]: + for d in accounts clock_skew network Jan 27 05:55:04.401197 init.sh[1654]: + daemon_pids+=($!) Jan 27 05:55:04.401197 init.sh[1654]: + NOTIFY_SOCKET=/run/systemd/notify Jan 27 05:55:04.401197 init.sh[1654]: + /usr/bin/systemd-notify --ready Jan 27 05:55:04.404015 init.sh[1784]: + /usr/bin/google_network_daemon Jan 27 05:55:04.405979 sshd[1776]: Accepted publickey for core from 4.153.228.146 port 39604 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:04.411311 sshd-session[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:04.428639 systemd-logind[1576]: New session 3 of user core. Jan 27 05:55:04.434396 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 27 05:55:04.443988 systemd[1]: Started oem-gce.service - GCE Linux Agent. 
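The instance-setup entries above show what google_set_multiqueue does on this 2-vCPU guest: it spreads the virtio-net interrupts across the two CPUs and programs transmit packet steering (XPS) per TX queue; the two "write error" lines come from an echo into a sysfs node the script cannot set, and the queue/XPS settings that follow still succeed. The same layout can be reproduced by hand through the files it writes (sketch only; IRQ numbers 31-34 are the ones from this log and will differ on another instance):

# Pin virtio-net queue interrupts: IRQs 31/32 -> CPU 0, IRQs 33/34 -> CPU 1
echo 0 | sudo tee /proc/irq/31/smp_affinity_list /proc/irq/32/smp_affinity_list
echo 1 | sudo tee /proc/irq/33/smp_affinity_list /proc/irq/34/smp_affinity_list
# XPS bitmasks matching the log above: tx-0 -> CPU 0 (0x1), tx-1 -> CPU 1 (0x2)
echo 1 | sudo tee /sys/class/net/eth0/queues/tx-0/xps_cpus
echo 2 | sudo tee /sys/class/net/eth0/queues/tx-1/xps_cpus
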
Jan 27 05:55:04.463186 init.sh[1654]: + wait -n 1781 1782 1784 Jan 27 05:55:04.543024 sshd[1787]: Connection closed by 4.153.228.146 port 39604 Jan 27 05:55:04.543923 sshd-session[1776]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:04.554285 systemd[1]: sshd@1-10.128.0.23:22-4.153.228.146:39604.service: Deactivated successfully. Jan 27 05:55:04.559901 systemd[1]: session-3.scope: Deactivated successfully. Jan 27 05:55:04.562637 systemd-logind[1576]: Session 3 logged out. Waiting for processes to exit. Jan 27 05:55:04.568788 systemd-logind[1576]: Removed session 3. Jan 27 05:55:04.590921 systemd[1]: Started sshd@2-10.128.0.23:22-4.153.228.146:39614.service - OpenSSH per-connection server daemon (4.153.228.146:39614). Jan 27 05:55:04.791721 ntpd[1570]: Listen normally on 6 eth0 [fe80::4001:aff:fe80:17%2]:123 Jan 27 05:55:04.792169 ntpd[1570]: 27 Jan 05:55:04 ntpd[1570]: Listen normally on 6 eth0 [fe80::4001:aff:fe80:17%2]:123 Jan 27 05:55:04.876412 sshd[1793]: Accepted publickey for core from 4.153.228.146 port 39614 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:04.879056 sshd-session[1793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:04.894493 systemd-logind[1576]: New session 4 of user core. Jan 27 05:55:04.901660 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 27 05:55:04.934108 google-clock-skew[1782]: INFO Starting Google Clock Skew daemon. Jan 27 05:55:04.947557 google-clock-skew[1782]: INFO Clock drift token has changed: 0. Jan 27 05:55:04.979537 google-networking[1784]: INFO Starting Google Networking daemon. Jan 27 05:55:05.000425 sshd[1803]: Connection closed by 4.153.228.146 port 39614 Jan 27 05:55:05.001578 sshd-session[1793]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:05.009171 groupadd[1804]: group added to /etc/group: name=google-sudoers, GID=1000 Jan 27 05:55:05.013561 systemd[1]: sshd@2-10.128.0.23:22-4.153.228.146:39614.service: Deactivated successfully. Jan 27 05:55:05.016527 groupadd[1804]: group added to /etc/gshadow: name=google-sudoers Jan 27 05:55:05.024523 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:05.035316 systemd[1]: session-4.scope: Deactivated successfully. Jan 27 05:55:05.039938 (kubelet)[1816]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 05:55:05.043921 systemd-logind[1576]: Session 4 logged out. Waiting for processes to exit. Jan 27 05:55:05.045235 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 27 05:55:05.055148 systemd[1]: Startup finished in 2.805s (kernel) + 8.609s (initrd) + 10.022s (userspace) = 21.437s. Jan 27 05:55:05.057449 systemd-logind[1576]: Removed session 4. Jan 27 05:55:05.182916 groupadd[1804]: new group: name=google-sudoers, GID=1000 Jan 27 05:55:05.241967 google-accounts[1781]: INFO Starting Google Accounts daemon. Jan 27 05:55:05.259335 google-accounts[1781]: WARNING OS Login not installed. Jan 27 05:55:05.260891 google-accounts[1781]: INFO Creating a new user account for 0. Jan 27 05:55:05.268561 init.sh[1831]: useradd: invalid user name '0': use --badname to ignore Jan 27 05:55:05.268843 google-accounts[1781]: WARNING Could not create user 0. Command '['useradd', '-m', '-s', '/bin/bash', '-p', '*', '0']' returned non-zero exit status 3.. Jan 27 05:55:06.000093 systemd-resolved[1243]: Clock change detected. Flushing caches. 
Jan 27 05:55:06.000706 google-clock-skew[1782]: INFO Synced system time with hardware clock. Jan 27 05:55:06.321396 kubelet[1816]: E0127 05:55:06.321223 1816 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 05:55:06.324433 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 05:55:06.324705 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 05:55:06.325515 systemd[1]: kubelet.service: Consumed 1.306s CPU time, 263.9M memory peak. Jan 27 05:55:15.445281 systemd[1]: Started sshd@3-10.128.0.23:22-4.153.228.146:53918.service - OpenSSH per-connection server daemon (4.153.228.146:53918). Jan 27 05:55:15.672417 sshd[1837]: Accepted publickey for core from 4.153.228.146 port 53918 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:15.673916 sshd-session[1837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:15.682031 systemd-logind[1576]: New session 5 of user core. Jan 27 05:55:15.685612 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 27 05:55:15.770897 sshd[1841]: Connection closed by 4.153.228.146 port 53918 Jan 27 05:55:15.771841 sshd-session[1837]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:15.778004 systemd[1]: sshd@3-10.128.0.23:22-4.153.228.146:53918.service: Deactivated successfully. Jan 27 05:55:15.780694 systemd[1]: session-5.scope: Deactivated successfully. Jan 27 05:55:15.783996 systemd-logind[1576]: Session 5 logged out. Waiting for processes to exit. Jan 27 05:55:15.785550 systemd-logind[1576]: Removed session 5. Jan 27 05:55:15.812671 systemd[1]: Started sshd@4-10.128.0.23:22-4.153.228.146:53932.service - OpenSSH per-connection server daemon (4.153.228.146:53932). Jan 27 05:55:16.040091 sshd[1847]: Accepted publickey for core from 4.153.228.146 port 53932 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:16.041076 sshd-session[1847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:16.048435 systemd-logind[1576]: New session 6 of user core. Jan 27 05:55:16.055626 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 27 05:55:16.131275 sshd[1851]: Connection closed by 4.153.228.146 port 53932 Jan 27 05:55:16.132654 sshd-session[1847]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:16.139159 systemd[1]: sshd@4-10.128.0.23:22-4.153.228.146:53932.service: Deactivated successfully. Jan 27 05:55:16.141584 systemd[1]: session-6.scope: Deactivated successfully. Jan 27 05:55:16.143611 systemd-logind[1576]: Session 6 logged out. Waiting for processes to exit. Jan 27 05:55:16.145049 systemd-logind[1576]: Removed session 6. Jan 27 05:55:16.171426 systemd[1]: Started sshd@5-10.128.0.23:22-4.153.228.146:53948.service - OpenSSH per-connection server daemon (4.153.228.146:53948). Jan 27 05:55:16.351828 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 27 05:55:16.356665 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
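The kubelet failure above is the normal state of an image that has not been bootstrapped into a cluster yet: the unit exits because /var/lib/kubelet/config.yaml does not exist, and systemd keeps scheduling restarts (restart counter 1 above) until a provisioner such as kubeadm writes that file during join. As a hedged illustration only of what the unit is waiting for (the values are generic defaults, not this node's eventual configuration):

# Hypothetical sketch; in a real flow `kubeadm join` (or another provisioner) generates this file.
sudo mkdir -p /var/lib/kubelet
sudo tee /var/lib/kubelet/config.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd    # matches SystemdCgroup=true in the containerd CRI config logged earlier
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
authentication:
  anonymous:
    enabled: false
EOF
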
Jan 27 05:55:16.404857 sshd[1857]: Accepted publickey for core from 4.153.228.146 port 53948 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:16.406642 sshd-session[1857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:16.413221 systemd-logind[1576]: New session 7 of user core. Jan 27 05:55:16.421023 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 27 05:55:16.503664 sshd[1864]: Connection closed by 4.153.228.146 port 53948 Jan 27 05:55:16.505642 sshd-session[1857]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:16.514667 systemd[1]: sshd@5-10.128.0.23:22-4.153.228.146:53948.service: Deactivated successfully. Jan 27 05:55:16.517222 systemd[1]: session-7.scope: Deactivated successfully. Jan 27 05:55:16.518730 systemd-logind[1576]: Session 7 logged out. Waiting for processes to exit. Jan 27 05:55:16.523543 systemd-logind[1576]: Removed session 7. Jan 27 05:55:16.551263 systemd[1]: Started sshd@6-10.128.0.23:22-4.153.228.146:53960.service - OpenSSH per-connection server daemon (4.153.228.146:53960). Jan 27 05:55:16.733191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:16.747856 (kubelet)[1878]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 05:55:16.777293 sshd[1870]: Accepted publickey for core from 4.153.228.146 port 53960 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:16.779635 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:16.791334 systemd-logind[1576]: New session 8 of user core. Jan 27 05:55:16.794627 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 27 05:55:16.812448 kubelet[1878]: E0127 05:55:16.812390 1878 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 05:55:16.816815 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 05:55:16.817042 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 05:55:16.817640 systemd[1]: kubelet.service: Consumed 218ms CPU time, 107.8M memory peak. Jan 27 05:55:16.870618 sudo[1887]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 27 05:55:16.871189 sudo[1887]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 05:55:16.885668 sudo[1887]: pam_unix(sudo:session): session closed for user root Jan 27 05:55:16.916693 sshd[1885]: Connection closed by 4.153.228.146 port 53960 Jan 27 05:55:16.917170 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:16.923997 systemd[1]: sshd@6-10.128.0.23:22-4.153.228.146:53960.service: Deactivated successfully. Jan 27 05:55:16.926550 systemd[1]: session-8.scope: Deactivated successfully. Jan 27 05:55:16.929191 systemd-logind[1576]: Session 8 logged out. Waiting for processes to exit. Jan 27 05:55:16.930990 systemd-logind[1576]: Removed session 8. Jan 27 05:55:16.959102 systemd[1]: Started sshd@7-10.128.0.23:22-4.153.228.146:53972.service - OpenSSH per-connection server daemon (4.153.228.146:53972). 
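The sudo entry above (/usr/sbin/setenforce 1) switches SELinux into enforcing mode for the running system only; the persistent mode is read from /etc/selinux/config at boot. Standard SELinux userland commands to check the result (sketch):

getenforce          # prints Enforcing or Permissive
sudo setenforce 1   # runtime switch, as issued in the session above
sestatus            # fuller report: current mode, loaded policy, SELinuxfs mount
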
Jan 27 05:55:17.187004 sshd[1894]: Accepted publickey for core from 4.153.228.146 port 53972 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:17.188838 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:17.197331 systemd-logind[1576]: New session 9 of user core. Jan 27 05:55:17.206611 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 27 05:55:17.264045 sudo[1900]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 27 05:55:17.264608 sudo[1900]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 05:55:17.268161 sudo[1900]: pam_unix(sudo:session): session closed for user root Jan 27 05:55:17.283305 sudo[1899]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 27 05:55:17.283863 sudo[1899]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 05:55:17.294297 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 05:55:17.344000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 05:55:17.362468 kernel: audit: type=1305 audit(1769493317.344:213): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 05:55:17.362555 kernel: audit: type=1300 audit(1769493317.344:213): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffde4b325b0 a2=420 a3=0 items=0 ppid=1905 pid=1924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:17.344000 audit[1924]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffde4b325b0 a2=420 a3=0 items=0 ppid=1905 pid=1924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:17.362710 augenrules[1924]: No rules Jan 27 05:55:17.361701 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 05:55:17.362090 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 05:55:17.375805 sudo[1899]: pam_unix(sudo:session): session closed for user root Jan 27 05:55:17.344000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:17.404774 kernel: audit: type=1327 audit(1769493317.344:213): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 05:55:17.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.405404 kernel: audit: type=1130 audit(1769493317.359:214): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:55:17.428632 sshd-session[1894]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:17.432943 sshd[1898]: Connection closed by 4.153.228.146 port 53972 Jan 27 05:55:17.439671 systemd-logind[1576]: Session 9 logged out. Waiting for processes to exit. Jan 27 05:55:17.444620 systemd[1]: sshd@7-10.128.0.23:22-4.153.228.146:53972.service: Deactivated successfully. Jan 27 05:55:17.448340 kernel: audit: type=1131 audit(1769493317.359:215): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.449228 kernel: audit: type=1106 audit(1769493317.374:216): pid=1899 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.374000 audit[1899]: USER_END pid=1899 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.448762 systemd[1]: session-9.scope: Deactivated successfully. Jan 27 05:55:17.452075 systemd-logind[1576]: Removed session 9. Jan 27 05:55:17.473445 kernel: audit: type=1104 audit(1769493317.374:217): pid=1899 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.374000 audit[1899]: CRED_DISP pid=1899 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.434000 audit[1894]: USER_END pid=1894 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.504268 systemd[1]: Started sshd@8-10.128.0.23:22-4.153.228.146:53978.service - OpenSSH per-connection server daemon (4.153.228.146:53978). 
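Earlier in this session the core user removed 80-selinux.rules and 99-default.rules from /etc/audit/rules.d and restarted audit-rules; augenrules then found no rules ("No rules" above) and auditctl flushed the kernel's list, which is what the op=remove_rule CONFIG_CHANGE record documents. Restoring auditing later follows the same path in reverse (sketch; the watch rule below is a generic example, not one of the deleted files):

# Hypothetical example: add a watch rule back and reload the kernel rule set
echo '-w /etc/passwd -p wa -k identity' | sudo tee /etc/audit/rules.d/50-identity.rules
sudo augenrules --load    # regenerate /etc/audit/audit.rules from rules.d and load it
sudo auditctl -l          # list the rules now active in the kernel
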
Jan 27 05:55:17.530342 kernel: audit: type=1106 audit(1769493317.434:218): pid=1894 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.555728 kernel: audit: type=1104 audit(1769493317.434:219): pid=1894 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.434000 audit[1894]: CRED_DISP pid=1894 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.444000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.128.0.23:22-4.153.228.146:53972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.580545 kernel: audit: type=1131 audit(1769493317.444:220): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.128.0.23:22-4.153.228.146:53972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.503000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.128.0.23:22-4.153.228.146:53978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.740000 audit[1933]: USER_ACCT pid=1933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.742505 sshd[1933]: Accepted publickey for core from 4.153.228.146 port 53978 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:55:17.742000 audit[1933]: CRED_ACQ pid=1933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.742000 audit[1933]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffedefc9bc0 a2=3 a3=0 items=0 ppid=1 pid=1933 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:17.742000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:55:17.744696 sshd-session[1933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:55:17.752439 systemd-logind[1576]: New session 10 of user core. Jan 27 05:55:17.759612 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 27 05:55:17.762000 audit[1933]: USER_START pid=1933 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.765000 audit[1937]: CRED_ACQ pid=1937 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:17.816000 audit[1938]: USER_ACCT pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.818055 sudo[1938]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 27 05:55:17.816000 audit[1938]: CRED_REFR pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.817000 audit[1938]: USER_START pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:17.818615 sudo[1938]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 05:55:18.364248 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 27 05:55:18.375896 (dockerd)[1957]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 27 05:55:18.754328 dockerd[1957]: time="2026-01-27T05:55:18.753940512Z" level=info msg="Starting up" Jan 27 05:55:18.756349 dockerd[1957]: time="2026-01-27T05:55:18.756309710Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 27 05:55:18.772325 dockerd[1957]: time="2026-01-27T05:55:18.772276470Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 27 05:55:18.831089 dockerd[1957]: time="2026-01-27T05:55:18.831047028Z" level=info msg="Loading containers: start." 
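The long run of NETFILTER_CFG/SYSCALL/PROCTITLE records that follows is dockerd programming its default netfilter layout; each proctitle field is simply the iptables or ip6tables command line encoded in hex. Decoded, the sequence amounts to the familiar Docker bootstrap, roughly as below (a few representative IPv4 commands only; the family=10 records repeat the same chains via ip6tables):

# Chains Docker creates in the nat and filter tables
iptables --wait -t nat -N DOCKER
iptables --wait -t filter -N DOCKER
iptables --wait -t filter -N DOCKER-FORWARD
iptables --wait -t filter -N DOCKER-USER
# Send locally-destined traffic through the DOCKER chain and hook FORWARD
iptables --wait -t nat -A PREROUTING -m addrtype --dst-type LOCAL -j DOCKER
iptables --wait -I FORWARD -j DOCKER-USER
# Masquerade traffic leaving the default bridge network (docker0, 172.17.0.0/16)
iptables --wait -t nat -I POSTROUTING -s 172.17.0.0/16 ! -o docker0 -j MASQUERADE
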
Jan 27 05:55:18.847412 kernel: Initializing XFRM netlink socket Jan 27 05:55:18.926000 audit[2005]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2005 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.926000 audit[2005]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffecc45fba0 a2=0 a3=0 items=0 ppid=1957 pid=2005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.926000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 05:55:18.929000 audit[2007]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2007 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.929000 audit[2007]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffffbe2f2b0 a2=0 a3=0 items=0 ppid=1957 pid=2007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.929000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 05:55:18.933000 audit[2009]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2009 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.933000 audit[2009]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc37202fb0 a2=0 a3=0 items=0 ppid=1957 pid=2009 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.933000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 05:55:18.936000 audit[2011]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2011 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.936000 audit[2011]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc4393c860 a2=0 a3=0 items=0 ppid=1957 pid=2011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.936000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 05:55:18.939000 audit[2013]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2013 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.939000 audit[2013]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffed4ee5b0 a2=0 a3=0 items=0 ppid=1957 pid=2013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.939000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 05:55:18.942000 audit[2015]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2015 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.942000 audit[2015]: SYSCALL arch=c000003e syscall=46 
success=yes exit=112 a0=3 a1=7ffd69537270 a2=0 a3=0 items=0 ppid=1957 pid=2015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.942000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 05:55:18.945000 audit[2017]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2017 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.945000 audit[2017]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe2906d150 a2=0 a3=0 items=0 ppid=1957 pid=2017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.945000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 05:55:18.949000 audit[2019]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2019 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.949000 audit[2019]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffed5b649e0 a2=0 a3=0 items=0 ppid=1957 pid=2019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.949000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 05:55:18.983000 audit[2022]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.983000 audit[2022]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7fffeb766f40 a2=0 a3=0 items=0 ppid=1957 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.983000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 27 05:55:18.986000 audit[2024]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.986000 audit[2024]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffceabd47d0 a2=0 a3=0 items=0 ppid=1957 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.986000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 05:55:18.990000 audit[2026]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2026 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.990000 audit[2026]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd1e1c1e50 a2=0 
a3=0 items=0 ppid=1957 pid=2026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.990000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 05:55:18.993000 audit[2028]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2028 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.993000 audit[2028]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fffe7900160 a2=0 a3=0 items=0 ppid=1957 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.993000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 05:55:18.997000 audit[2030]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2030 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:18.997000 audit[2030]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffc62910980 a2=0 a3=0 items=0 ppid=1957 pid=2030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:18.997000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 05:55:19.052000 audit[2060]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.052000 audit[2060]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffe380c8630 a2=0 a3=0 items=0 ppid=1957 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.052000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 05:55:19.056000 audit[2062]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2062 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.056000 audit[2062]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffebf2ef8f0 a2=0 a3=0 items=0 ppid=1957 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.056000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 05:55:19.059000 audit[2064]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2064 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.059000 audit[2064]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe9f2bb840 a2=0 a3=0 items=0 ppid=1957 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 27 05:55:19.059000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 05:55:19.062000 audit[2066]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.062000 audit[2066]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd95503c70 a2=0 a3=0 items=0 ppid=1957 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.062000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 05:55:19.065000 audit[2068]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2068 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.065000 audit[2068]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc28e8ac80 a2=0 a3=0 items=0 ppid=1957 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.065000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 05:55:19.068000 audit[2070]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.068000 audit[2070]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffdb692c890 a2=0 a3=0 items=0 ppid=1957 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.068000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 05:55:19.071000 audit[2072]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.071000 audit[2072]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe705ea480 a2=0 a3=0 items=0 ppid=1957 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.071000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 05:55:19.074000 audit[2074]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.074000 audit[2074]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffcda36f210 a2=0 a3=0 items=0 ppid=1957 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.074000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 05:55:19.077000 audit[2076]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.077000 audit[2076]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffffa1151f0 a2=0 a3=0 items=0 ppid=1957 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.077000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 27 05:55:19.080000 audit[2078]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2078 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.080000 audit[2078]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe0c1e9a00 a2=0 a3=0 items=0 ppid=1957 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.080000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 05:55:19.084000 audit[2080]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2080 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.084000 audit[2080]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffda2357e20 a2=0 a3=0 items=0 ppid=1957 pid=2080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.084000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 05:55:19.087000 audit[2082]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.087000 audit[2082]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffcd89bf1f0 a2=0 a3=0 items=0 ppid=1957 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.087000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 05:55:19.091000 audit[2084]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.091000 audit[2084]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fff86520940 a2=0 a3=0 items=0 ppid=1957 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.091000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 05:55:19.098000 audit[2089]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.098000 audit[2089]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffead75d590 a2=0 a3=0 items=0 ppid=1957 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.098000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 05:55:19.101000 audit[2091]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.101000 audit[2091]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fffdc54e480 a2=0 a3=0 items=0 ppid=1957 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.101000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 05:55:19.104000 audit[2093]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.104000 audit[2093]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd40348170 a2=0 a3=0 items=0 ppid=1957 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.104000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 05:55:19.107000 audit[2095]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.107000 audit[2095]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff6992e860 a2=0 a3=0 items=0 ppid=1957 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.107000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 05:55:19.110000 audit[2097]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.110000 audit[2097]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffec66de410 a2=0 a3=0 items=0 ppid=1957 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.110000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 05:55:19.113000 audit[2099]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2099 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:19.113000 audit[2099]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe42a77470 a2=0 a3=0 items=0 ppid=1957 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.113000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 05:55:19.140000 audit[2105]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.140000 audit[2105]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffeda494c50 a2=0 a3=0 items=0 ppid=1957 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.140000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 27 05:55:19.144000 audit[2107]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.144000 audit[2107]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd0cec0810 a2=0 a3=0 items=0 ppid=1957 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.144000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 27 05:55:19.160000 audit[2115]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.160000 audit[2115]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffedcccc180 a2=0 a3=0 items=0 ppid=1957 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.160000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 27 05:55:19.175000 audit[2121]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.175000 audit[2121]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd76c6ac60 a2=0 a3=0 items=0 ppid=1957 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.175000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 27 05:55:19.179000 audit[2123]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 
05:55:19.179000 audit[2123]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffe4b3342c0 a2=0 a3=0 items=0 ppid=1957 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.179000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 27 05:55:19.183000 audit[2125]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.183000 audit[2125]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd173a0e00 a2=0 a3=0 items=0 ppid=1957 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.183000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 27 05:55:19.187000 audit[2127]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.187000 audit[2127]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffdf4e359b0 a2=0 a3=0 items=0 ppid=1957 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.187000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 05:55:19.190000 audit[2129]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:19.190000 audit[2129]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff6ba63410 a2=0 a3=0 items=0 ppid=1957 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:19.190000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 27 05:55:19.192810 systemd-networkd[1499]: docker0: Link UP Jan 27 05:55:19.198551 dockerd[1957]: time="2026-01-27T05:55:19.198438285Z" level=info msg="Loading containers: done." Jan 27 05:55:19.217445 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck819335029-merged.mount: Deactivated successfully. 
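Each audit PROCTITLE value in the records above is the hex-encoded, NUL-separated argv of the iptables/ip6tables call dockerd made while creating its DOCKER-* chains and rules. A minimal decoding sketch in plain Python; the sample value is copied from the iptables record for pid 2089 above:

    # Decode an audit PROCTITLE field (hex of the process argv, NUL-separated)
    # back into the command line that was executed.
    proctitle = (
        "2F7573722F62696E2F69707461626C6573002D2D77616974"
        "002D740066696C746572002D4E00444F434B45522D55534552"
    )

    argv = bytes.fromhex(proctitle).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
    # -> /usr/bin/iptables --wait -t filter -N DOCKER-USER

Applied across the records above, this recovers the full sequence of chain creations (DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER) and the MASQUERADE/ACCEPT rules dockerd installs for docker0.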
Jan 27 05:55:19.220849 dockerd[1957]: time="2026-01-27T05:55:19.220792316Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 27 05:55:19.220985 dockerd[1957]: time="2026-01-27T05:55:19.220903015Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 27 05:55:19.221043 dockerd[1957]: time="2026-01-27T05:55:19.221027226Z" level=info msg="Initializing buildkit" Jan 27 05:55:19.250070 dockerd[1957]: time="2026-01-27T05:55:19.250027213Z" level=info msg="Completed buildkit initialization" Jan 27 05:55:19.261309 dockerd[1957]: time="2026-01-27T05:55:19.261259447Z" level=info msg="Daemon has completed initialization" Jan 27 05:55:19.261684 dockerd[1957]: time="2026-01-27T05:55:19.261451537Z" level=info msg="API listen on /run/docker.sock" Jan 27 05:55:19.261681 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 27 05:55:19.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:20.120818 containerd[1613]: time="2026-01-27T05:55:20.120763683Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 27 05:55:20.559236 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1084547345.mount: Deactivated successfully. Jan 27 05:55:22.202730 containerd[1613]: time="2026-01-27T05:55:22.202665892Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:22.203942 containerd[1613]: time="2026-01-27T05:55:22.203897426Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=0" Jan 27 05:55:22.205515 containerd[1613]: time="2026-01-27T05:55:22.205449181Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:22.208823 containerd[1613]: time="2026-01-27T05:55:22.208748958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:22.210283 containerd[1613]: time="2026-01-27T05:55:22.210091893Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 2.089270831s" Jan 27 05:55:22.210283 containerd[1613]: time="2026-01-27T05:55:22.210137373Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 27 05:55:22.211347 containerd[1613]: time="2026-01-27T05:55:22.211300859Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 27 05:55:23.579957 containerd[1613]: time="2026-01-27T05:55:23.579887899Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:23.581291 containerd[1613]: time="2026-01-27T05:55:23.581236598Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 27 05:55:23.582544 containerd[1613]: time="2026-01-27T05:55:23.582479397Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:23.589383 containerd[1613]: time="2026-01-27T05:55:23.588642891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:23.592177 containerd[1613]: time="2026-01-27T05:55:23.592131183Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.380773273s" Jan 27 05:55:23.592287 containerd[1613]: time="2026-01-27T05:55:23.592189625Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 27 05:55:23.594262 containerd[1613]: time="2026-01-27T05:55:23.594227870Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 27 05:55:24.684263 containerd[1613]: time="2026-01-27T05:55:24.684193226Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:24.685681 containerd[1613]: time="2026-01-27T05:55:24.685626222Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 27 05:55:24.686861 containerd[1613]: time="2026-01-27T05:55:24.686799444Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:24.690398 containerd[1613]: time="2026-01-27T05:55:24.690267267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:24.691833 containerd[1613]: time="2026-01-27T05:55:24.691548936Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.097279841s" Jan 27 05:55:24.691833 containerd[1613]: time="2026-01-27T05:55:24.691597878Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 27 05:55:24.692517 containerd[1613]: time="2026-01-27T05:55:24.692465113Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 27 05:55:25.759790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4037389769.mount: Deactivated successfully. Jan 27 05:55:26.408548 containerd[1613]: time="2026-01-27T05:55:26.408481615Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:26.409940 containerd[1613]: time="2026-01-27T05:55:26.409881764Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=19572392" Jan 27 05:55:26.411073 containerd[1613]: time="2026-01-27T05:55:26.411010004Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:26.413398 containerd[1613]: time="2026-01-27T05:55:26.413317829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:26.414381 containerd[1613]: time="2026-01-27T05:55:26.414125899Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.721619199s" Jan 27 05:55:26.414381 containerd[1613]: time="2026-01-27T05:55:26.414167764Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 27 05:55:26.414901 containerd[1613]: time="2026-01-27T05:55:26.414873460Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 27 05:55:26.822595 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3101824030.mount: Deactivated successfully. Jan 27 05:55:26.826922 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 27 05:55:26.831711 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 05:55:27.255000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:27.256429 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:27.265687 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 27 05:55:27.265795 kernel: audit: type=1130 audit(1769493327.255:271): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:55:27.289899 (kubelet)[2256]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 05:55:27.367751 kubelet[2256]: E0127 05:55:27.367679 2256 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 05:55:27.370882 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 05:55:27.371133 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 05:55:27.396382 kernel: audit: type=1131 audit(1769493327.371:272): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 05:55:27.371000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 05:55:27.372529 systemd[1]: kubelet.service: Consumed 232ms CPU time, 108.6M memory peak. Jan 27 05:55:28.288697 containerd[1613]: time="2026-01-27T05:55:28.288618437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:28.290400 containerd[1613]: time="2026-01-27T05:55:28.290325404Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17569975" Jan 27 05:55:28.291342 containerd[1613]: time="2026-01-27T05:55:28.291267531Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:28.296327 containerd[1613]: time="2026-01-27T05:55:28.296162976Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:28.297853 containerd[1613]: time="2026-01-27T05:55:28.297636783Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.882566544s" Jan 27 05:55:28.297853 containerd[1613]: time="2026-01-27T05:55:28.297680340Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 27 05:55:28.298728 containerd[1613]: time="2026-01-27T05:55:28.298556952Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 27 05:55:28.651608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2246536828.mount: Deactivated successfully. 
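The kubelet exits with status 1 here because /var/lib/kubelet/config.yaml does not exist yet; kubeadm writes that file during init/join, so the unit keeps restarting until it appears. A hypothetical pre-flight check mirroring that failure (plain stdlib, not part of the host image):

    # Hypothetical pre-flight check: kubelet fails with "no such file or directory"
    # for /var/lib/kubelet/config.yaml until kubeadm has generated it.
    from pathlib import Path

    CONFIG = Path("/var/lib/kubelet/config.yaml")

    def kubelet_config_ready(path: Path = CONFIG) -> bool:
        """Return True once the kubeadm-generated kubelet config is in place."""
        if not path.is_file():
            print(f"{path}: no such file or directory (kubelet will exit 1)")
            return False
        print(f"{path}: {path.stat().st_size} bytes, kubelet can start")
        return True

    if __name__ == "__main__":
        kubelet_config_ready()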
Jan 27 05:55:28.659114 containerd[1613]: time="2026-01-27T05:55:28.659053950Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 05:55:28.660016 containerd[1613]: time="2026-01-27T05:55:28.659976644Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 27 05:55:28.661737 containerd[1613]: time="2026-01-27T05:55:28.661662489Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 05:55:28.664871 containerd[1613]: time="2026-01-27T05:55:28.664777054Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 05:55:28.666281 containerd[1613]: time="2026-01-27T05:55:28.665873046Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 367.271164ms" Jan 27 05:55:28.666281 containerd[1613]: time="2026-01-27T05:55:28.665916854Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 27 05:55:28.666906 containerd[1613]: time="2026-01-27T05:55:28.666629307Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 27 05:55:29.028907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3211295480.mount: Deactivated successfully. 
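containerd logs each completed pull with the reported image size and wall-clock duration, which allows a rough transfer-rate estimate. A back-of-the-envelope sketch using the pause:3.10 values from the entries above (the reported size is the registry/manifest size, so the rate is only approximate):

    # Rough pull throughput from the size and duration containerd reports.
    size_bytes = 320_368          # 'size "320368"' for pause:3.10
    duration_s = 0.367271164      # 'in 367.271164ms'

    rate = size_bytes / duration_s
    print(f"pause:3.10: {size_bytes} B in {duration_s*1000:.0f} ms "
          f"-> {rate/1e6:.2f} MB/s")
    # -> pause:3.10: 320368 B in 367 ms -> 0.87 MB/s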
Jan 27 05:55:31.198838 containerd[1613]: time="2026-01-27T05:55:31.198773414Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:31.200299 containerd[1613]: time="2026-01-27T05:55:31.200230200Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=55728979" Jan 27 05:55:31.201745 containerd[1613]: time="2026-01-27T05:55:31.201670542Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:31.205305 containerd[1613]: time="2026-01-27T05:55:31.205228205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:31.206953 containerd[1613]: time="2026-01-27T05:55:31.206664241Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.539995065s" Jan 27 05:55:31.206953 containerd[1613]: time="2026-01-27T05:55:31.206713566Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 27 05:55:33.544240 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Jan 27 05:55:33.543000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hostnamed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:33.568475 kernel: audit: type=1131 audit(1769493333.543:273): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hostnamed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:33.583000 audit: BPF prog-id=60 op=UNLOAD Jan 27 05:55:33.592397 kernel: audit: type=1334 audit(1769493333.583:274): prog-id=60 op=UNLOAD Jan 27 05:55:35.298384 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:35.298732 systemd[1]: kubelet.service: Consumed 232ms CPU time, 108.6M memory peak. Jan 27 05:55:35.297000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:35.306759 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 05:55:35.297000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:35.342119 kernel: audit: type=1130 audit(1769493335.297:275): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:55:35.342208 kernel: audit: type=1131 audit(1769493335.297:276): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:35.359397 systemd[1]: Reload requested from client PID 2399 ('systemctl') (unit session-10.scope)... Jan 27 05:55:35.359420 systemd[1]: Reloading... Jan 27 05:55:35.525422 zram_generator::config[2447]: No configuration found. Jan 27 05:55:35.873164 systemd[1]: Reloading finished in 513 ms. Jan 27 05:55:35.918404 kernel: audit: type=1334 audit(1769493335.907:277): prog-id=64 op=LOAD Jan 27 05:55:35.907000 audit: BPF prog-id=64 op=LOAD Jan 27 05:55:35.907000 audit: BPF prog-id=51 op=UNLOAD Jan 27 05:55:35.907000 audit: BPF prog-id=65 op=LOAD Jan 27 05:55:35.935549 kernel: audit: type=1334 audit(1769493335.907:278): prog-id=51 op=UNLOAD Jan 27 05:55:35.935626 kernel: audit: type=1334 audit(1769493335.907:279): prog-id=65 op=LOAD Jan 27 05:55:35.935668 kernel: audit: type=1334 audit(1769493335.907:280): prog-id=66 op=LOAD Jan 27 05:55:35.907000 audit: BPF prog-id=66 op=LOAD Jan 27 05:55:35.942558 kernel: audit: type=1334 audit(1769493335.907:281): prog-id=52 op=UNLOAD Jan 27 05:55:35.907000 audit: BPF prog-id=52 op=UNLOAD Jan 27 05:55:35.949736 kernel: audit: type=1334 audit(1769493335.907:282): prog-id=53 op=UNLOAD Jan 27 05:55:35.907000 audit: BPF prog-id=53 op=UNLOAD Jan 27 05:55:35.909000 audit: BPF prog-id=67 op=LOAD Jan 27 05:55:35.909000 audit: BPF prog-id=42 op=UNLOAD Jan 27 05:55:35.909000 audit: BPF prog-id=68 op=LOAD Jan 27 05:55:35.909000 audit: BPF prog-id=69 op=LOAD Jan 27 05:55:35.909000 audit: BPF prog-id=43 op=UNLOAD Jan 27 05:55:35.909000 audit: BPF prog-id=44 op=UNLOAD Jan 27 05:55:35.916000 audit: BPF prog-id=70 op=LOAD Jan 27 05:55:35.916000 audit: BPF prog-id=57 op=UNLOAD Jan 27 05:55:35.917000 audit: BPF prog-id=71 op=LOAD Jan 27 05:55:35.917000 audit: BPF prog-id=72 op=LOAD Jan 27 05:55:35.917000 audit: BPF prog-id=58 op=UNLOAD Jan 27 05:55:35.917000 audit: BPF prog-id=59 op=UNLOAD Jan 27 05:55:35.920000 audit: BPF prog-id=73 op=LOAD Jan 27 05:55:35.957000 audit: BPF prog-id=48 op=UNLOAD Jan 27 05:55:35.957000 audit: BPF prog-id=74 op=LOAD Jan 27 05:55:35.957000 audit: BPF prog-id=75 op=LOAD Jan 27 05:55:35.957000 audit: BPF prog-id=49 op=UNLOAD Jan 27 05:55:35.957000 audit: BPF prog-id=50 op=UNLOAD Jan 27 05:55:35.957000 audit: BPF prog-id=76 op=LOAD Jan 27 05:55:35.957000 audit: BPF prog-id=77 op=LOAD Jan 27 05:55:35.957000 audit: BPF prog-id=54 op=UNLOAD Jan 27 05:55:35.957000 audit: BPF prog-id=55 op=UNLOAD Jan 27 05:55:35.958000 audit: BPF prog-id=78 op=LOAD Jan 27 05:55:35.958000 audit: BPF prog-id=45 op=UNLOAD Jan 27 05:55:35.958000 audit: BPF prog-id=79 op=LOAD Jan 27 05:55:35.958000 audit: BPF prog-id=80 op=LOAD Jan 27 05:55:35.958000 audit: BPF prog-id=46 op=UNLOAD Jan 27 05:55:35.958000 audit: BPF prog-id=47 op=UNLOAD Jan 27 05:55:35.961000 audit: BPF prog-id=81 op=LOAD Jan 27 05:55:35.961000 audit: BPF prog-id=56 op=UNLOAD Jan 27 05:55:35.963000 audit: BPF prog-id=82 op=LOAD Jan 27 05:55:35.963000 audit: BPF prog-id=63 op=UNLOAD Jan 27 05:55:35.964000 audit: BPF prog-id=83 op=LOAD Jan 27 05:55:35.964000 audit: BPF prog-id=41 op=UNLOAD Jan 27 05:55:35.986932 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 27 05:55:35.987068 systemd[1]: kubelet.service: Failed with result 'signal'. 
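The burst of "audit: BPF prog-id=NN op=LOAD/UNLOAD" records accompanies the daemon reload logged just above: systemd detaches and re-attaches per-unit cgroup BPF programs when it reloads. A rough sketch that tallies those operations from a saved console/journal dump (the file name boot-console.log is an assumption; the kauditd echoes of the same events use a different format and are not counted):

    # Tally BPF program load/unload audit records from a saved console dump.
    import re
    from collections import Counter

    ops = Counter()
    with open("boot-console.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            ops.update(re.findall(r"audit: BPF prog-id=\d+ op=(LOAD|UNLOAD)", line))

    print(dict(ops))  # e.g. {'LOAD': ..., 'UNLOAD': ...} for the reload window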
Jan 27 05:55:35.987564 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:35.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 05:55:35.987646 systemd[1]: kubelet.service: Consumed 167ms CPU time, 98.5M memory peak. Jan 27 05:55:35.989902 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 05:55:36.350653 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:36.350000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:36.364138 (kubelet)[2498]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 05:55:36.425510 kubelet[2498]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 05:55:36.425510 kubelet[2498]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 05:55:36.425510 kubelet[2498]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 05:55:36.426387 kubelet[2498]: I0127 05:55:36.426023 2498 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 05:55:36.819808 kubelet[2498]: I0127 05:55:36.819753 2498 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 27 05:55:36.819808 kubelet[2498]: I0127 05:55:36.819788 2498 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 05:55:36.822399 kubelet[2498]: I0127 05:55:36.820987 2498 server.go:954] "Client rotation is on, will bootstrap in background" Jan 27 05:55:36.881589 kubelet[2498]: E0127 05:55:36.881538 2498 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.128.0.23:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:36.881952 kubelet[2498]: I0127 05:55:36.881892 2498 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 05:55:36.891944 kubelet[2498]: I0127 05:55:36.891873 2498 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 05:55:36.896026 kubelet[2498]: I0127 05:55:36.895996 2498 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 27 05:55:36.897684 kubelet[2498]: I0127 05:55:36.897607 2498 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 05:55:36.897938 kubelet[2498]: I0127 05:55:36.897669 2498 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 05:55:36.898154 kubelet[2498]: I0127 05:55:36.897950 2498 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 05:55:36.898154 kubelet[2498]: I0127 05:55:36.897968 2498 container_manager_linux.go:304] "Creating device plugin manager" Jan 27 05:55:36.898154 kubelet[2498]: I0127 05:55:36.898131 2498 state_mem.go:36] "Initialized new in-memory state store" Jan 27 05:55:36.905994 kubelet[2498]: I0127 05:55:36.905935 2498 kubelet.go:446] "Attempting to sync node with API server" Jan 27 05:55:36.908991 kubelet[2498]: I0127 05:55:36.908327 2498 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 05:55:36.908991 kubelet[2498]: I0127 05:55:36.908403 2498 kubelet.go:352] "Adding apiserver pod source" Jan 27 05:55:36.908991 kubelet[2498]: I0127 05:55:36.908421 2498 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 05:55:36.915582 kubelet[2498]: W0127 05:55:36.915521 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.128.0.23:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf&limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:36.915794 kubelet[2498]: E0127 05:55:36.915765 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.128.0.23:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf&limit=500&resourceVersion=0\": dial tcp 
10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:36.916054 kubelet[2498]: I0127 05:55:36.916032 2498 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 05:55:36.916858 kubelet[2498]: I0127 05:55:36.916829 2498 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 05:55:36.918240 kubelet[2498]: W0127 05:55:36.918209 2498 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 27 05:55:36.927341 kubelet[2498]: I0127 05:55:36.927316 2498 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 27 05:55:36.927532 kubelet[2498]: I0127 05:55:36.927515 2498 server.go:1287] "Started kubelet" Jan 27 05:55:36.927638 kubelet[2498]: W0127 05:55:36.927570 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.128.0.23:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:36.927709 kubelet[2498]: E0127 05:55:36.927663 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.128.0.23:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:36.930388 kubelet[2498]: I0127 05:55:36.930313 2498 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 05:55:36.935000 audit[2509]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.935000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc00baa250 a2=0 a3=0 items=0 ppid=2498 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.935000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 05:55:36.941000 audit[2510]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2510 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.941000 audit[2510]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff6a91bf60 a2=0 a3=0 items=0 ppid=2498 pid=2510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.941000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 05:55:36.943483 kubelet[2498]: I0127 05:55:36.943421 2498 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 05:55:36.945723 kubelet[2498]: I0127 05:55:36.945667 2498 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 05:55:36.946744 kubelet[2498]: I0127 05:55:36.946721 2498 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 27 05:55:36.947109 kubelet[2498]: E0127 05:55:36.947083 2498 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" Jan 27 05:55:36.947224 kubelet[2498]: I0127 05:55:36.947152 2498 server.go:479] "Adding debug handlers to kubelet server" Jan 27 05:55:36.948710 kubelet[2498]: I0127 05:55:36.947175 2498 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 05:55:36.947000 audit[2512]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.947000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffe3806fc0 a2=0 a3=0 items=0 ppid=2498 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.947000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 05:55:36.949470 kubelet[2498]: I0127 05:55:36.949446 2498 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 05:55:36.950306 kubelet[2498]: I0127 05:55:36.950281 2498 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 27 05:55:36.950500 kubelet[2498]: I0127 05:55:36.950484 2498 reconciler.go:26] "Reconciler: start to sync state" Jan 27 05:55:36.951000 audit[2514]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2514 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.951000 audit[2514]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffdad517b10 a2=0 a3=0 items=0 ppid=2498 pid=2514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.951000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 05:55:36.954841 kubelet[2498]: E0127 05:55:36.951483 2498 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.128.0.23:6443/api/v1/namespaces/default/events\": dial tcp 10.128.0.23:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf.188e80d093fbbb76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,UID:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,},FirstTimestamp:2026-01-27 05:55:36.92748479 +0000 UTC m=+0.557714846,LastTimestamp:2026-01-27 05:55:36.92748479 +0000 UTC m=+0.557714846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,}" Jan 27 05:55:36.955604 kubelet[2498]: I0127 05:55:36.955565 2498 factory.go:221] Registration of the systemd container factory successfully Jan 27 05:55:36.955787 kubelet[2498]: I0127 05:55:36.955757 2498 
factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 05:55:36.959501 kubelet[2498]: W0127 05:55:36.958561 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.128.0.23:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:36.959501 kubelet[2498]: E0127 05:55:36.958624 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.128.0.23:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:36.959501 kubelet[2498]: E0127 05:55:36.958717 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.23:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf?timeout=10s\": dial tcp 10.128.0.23:6443: connect: connection refused" interval="200ms" Jan 27 05:55:36.960967 kubelet[2498]: I0127 05:55:36.960941 2498 factory.go:221] Registration of the containerd container factory successfully Jan 27 05:55:36.968000 audit[2517]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2517 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.968000 audit[2517]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffcb37a2d60 a2=0 a3=0 items=0 ppid=2498 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.968000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 27 05:55:36.970688 kubelet[2498]: I0127 05:55:36.970652 2498 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 05:55:36.971000 audit[2519]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2519 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:36.971000 audit[2519]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffda3419a20 a2=0 a3=0 items=0 ppid=2498 pid=2519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.971000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 05:55:36.973173 kubelet[2498]: I0127 05:55:36.973149 2498 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 05:55:36.973304 kubelet[2498]: I0127 05:55:36.973280 2498 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 27 05:55:36.973433 kubelet[2498]: I0127 05:55:36.973417 2498 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
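All of the "dial tcp 10.128.0.23:6443: connect: connection refused" errors above come from the kubelet reaching for the API server endpoint before the static kube-apiserver pod exists. A minimal reachability probe of that endpoint (host and port copied from the log; the helper itself is hypothetical):

    # Probe the API server endpoint the kubelet keeps failing to reach.
    import socket

    def apiserver_reachable(host: str = "10.128.0.23", port: int = 6443,
                            timeout: float = 2.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as err:  # ConnectionRefusedError until the static pod is up
            print(f"{host}:{port} not reachable yet: {err}")
            return False

    if __name__ == "__main__":
        apiserver_reachable()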
Jan 27 05:55:36.973515 kubelet[2498]: I0127 05:55:36.973503 2498 kubelet.go:2382] "Starting kubelet main sync loop" Jan 27 05:55:36.973656 kubelet[2498]: E0127 05:55:36.973633 2498 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 05:55:36.973000 audit[2520]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2520 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.973000 audit[2520]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc45aa7a90 a2=0 a3=0 items=0 ppid=2498 pid=2520 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.973000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 05:55:36.975000 audit[2521]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2521 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.975000 audit[2521]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc900b5270 a2=0 a3=0 items=0 ppid=2498 pid=2521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.975000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 05:55:36.976000 audit[2522]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2522 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:36.976000 audit[2522]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc4f9fd8c0 a2=0 a3=0 items=0 ppid=2498 pid=2522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.976000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 05:55:36.978000 audit[2523]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2523 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:36.978000 audit[2523]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc05d13ff0 a2=0 a3=0 items=0 ppid=2498 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.978000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 05:55:36.979000 audit[2524]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2524 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:36.979000 audit[2524]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc18a69160 a2=0 a3=0 items=0 ppid=2498 pid=2524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 
05:55:36.979000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 05:55:36.981000 audit[2525]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2525 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:36.981000 audit[2525]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe5174d8d0 a2=0 a3=0 items=0 ppid=2498 pid=2525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:36.981000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 05:55:36.984929 kubelet[2498]: W0127 05:55:36.984162 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.23:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:36.984929 kubelet[2498]: E0127 05:55:36.984248 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.23:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:36.984929 kubelet[2498]: E0127 05:55:36.984397 2498 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 05:55:37.000098 kubelet[2498]: I0127 05:55:37.000066 2498 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 05:55:37.000348 kubelet[2498]: I0127 05:55:37.000239 2498 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 05:55:37.000348 kubelet[2498]: I0127 05:55:37.000271 2498 state_mem.go:36] "Initialized new in-memory state store" Jan 27 05:55:37.048534 kubelet[2498]: E0127 05:55:37.048466 2498 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" Jan 27 05:55:37.075344 kubelet[2498]: E0127 05:55:37.075175 2498 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 27 05:55:37.116214 kubelet[2498]: I0127 05:55:37.116168 2498 policy_none.go:49] "None policy: Start" Jan 27 05:55:37.116214 kubelet[2498]: I0127 05:55:37.116221 2498 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 27 05:55:37.116480 kubelet[2498]: I0127 05:55:37.116248 2498 state_mem.go:35] "Initializing new in-memory state store" Jan 27 05:55:37.149658 kubelet[2498]: E0127 05:55:37.149582 2498 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" Jan 27 05:55:37.159705 kubelet[2498]: E0127 05:55:37.159648 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.23:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf?timeout=10s\": dial tcp 10.128.0.23:6443: connect: connection refused" interval="400ms" Jan 27 05:55:37.214187 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
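The lease controller above first retries with interval="200ms" and then interval="400ms": a doubling backoff while the API server stays unreachable. A toy illustration of that doubling (only the 200ms and 400ms steps appear in the log; the factor-of-two continuation and the cap are assumptions):

    # Toy doubling backoff, seeded with the 200ms interval seen in the log.
    import itertools

    def backoff(start_ms: int = 200, factor: float = 2.0, cap_ms: int = 7000):
        delay = start_ms
        while True:
            yield delay
            delay = min(int(delay * factor), cap_ms)  # cap is an assumption

    print(list(itertools.islice(backoff(), 6)))
    # -> [200, 400, 800, 1600, 3200, 6400]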
Jan 27 05:55:37.228178 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 27 05:55:37.233117 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 27 05:55:37.246745 kubelet[2498]: I0127 05:55:37.246499 2498 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 05:55:37.246873 kubelet[2498]: I0127 05:55:37.246772 2498 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 05:55:37.246873 kubelet[2498]: I0127 05:55:37.246791 2498 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 05:55:37.248958 kubelet[2498]: I0127 05:55:37.248479 2498 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 05:55:37.249076 kubelet[2498]: E0127 05:55:37.249038 2498 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 27 05:55:37.249165 kubelet[2498]: E0127 05:55:37.249098 2498 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" Jan 27 05:55:37.295131 systemd[1]: Created slice kubepods-burstable-pod4b6cc33d9cebe6713adb74370d3180a3.slice - libcontainer container kubepods-burstable-pod4b6cc33d9cebe6713adb74370d3180a3.slice. Jan 27 05:55:37.312461 kubelet[2498]: E0127 05:55:37.312420 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.318466 systemd[1]: Created slice kubepods-burstable-pod6db618a671879346c6fb373896e7fc78.slice - libcontainer container kubepods-burstable-pod6db618a671879346c6fb373896e7fc78.slice. Jan 27 05:55:37.322416 kubelet[2498]: E0127 05:55:37.322172 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.325221 systemd[1]: Created slice kubepods-burstable-podfd705fb348d77328eefb16f73cbcb849.slice - libcontainer container kubepods-burstable-podfd705fb348d77328eefb16f73cbcb849.slice. 
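systemd has now created one burstable slice per static control-plane pod, with the pod UID embedded in the slice name. A small sketch that recovers the UID (slice names copied from the entries above); the same UIDs reappear in the volume reconciler entries that follow:

    # Extract the pod UID from a kubelet-created burstable pod slice name.
    import re

    SLICES = [
        "kubepods-burstable-pod4b6cc33d9cebe6713adb74370d3180a3.slice",
        "kubepods-burstable-pod6db618a671879346c6fb373896e7fc78.slice",
        "kubepods-burstable-podfd705fb348d77328eefb16f73cbcb849.slice",
    ]

    def pod_uid(slice_name: str) -> str:
        match = re.fullmatch(r"kubepods-burstable-pod([0-9a-f]+)\.slice", slice_name)
        if not match:
            raise ValueError(f"not a burstable pod slice: {slice_name}")
        return match.group(1)

    for name in SLICES:
        print(pod_uid(name))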
Jan 27 05:55:37.328736 kubelet[2498]: E0127 05:55:37.328688 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.352341 kubelet[2498]: I0127 05:55:37.352305 2498 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.352765 kubelet[2498]: I0127 05:55:37.352733 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-kubeconfig\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.352892 kubelet[2498]: I0127 05:55:37.352840 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd705fb348d77328eefb16f73cbcb849-kubeconfig\") pod \"kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"fd705fb348d77328eefb16f73cbcb849\") " pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.352892 kubelet[2498]: I0127 05:55:37.352884 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-ca-certs\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.353015 kubelet[2498]: I0127 05:55:37.352914 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-k8s-certs\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.353015 kubelet[2498]: I0127 05:55:37.352945 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.353015 kubelet[2498]: I0127 05:55:37.352976 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-flexvolume-dir\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.353213 kubelet[2498]: I0127 05:55:37.353005 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-k8s-certs\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.354120 kubelet[2498]: E0127 05:55:37.352751 2498 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.23:6443/api/v1/nodes\": dial tcp 10.128.0.23:6443: connect: connection refused" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.354120 kubelet[2498]: I0127 05:55:37.353475 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.354120 kubelet[2498]: I0127 05:55:37.353649 2498 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-ca-certs\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.559978 kubelet[2498]: I0127 05:55:37.559629 2498 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.560548 kubelet[2498]: E0127 05:55:37.560088 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.23:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf?timeout=10s\": dial tcp 10.128.0.23:6443: connect: connection refused" interval="800ms" Jan 27 05:55:37.560548 kubelet[2498]: E0127 05:55:37.560187 2498 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.23:6443/api/v1/nodes\": dial tcp 10.128.0.23:6443: connect: connection refused" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.614626 containerd[1613]: time="2026-01-27T05:55:37.614480873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:4b6cc33d9cebe6713adb74370d3180a3,Namespace:kube-system,Attempt:0,}" Jan 27 05:55:37.624337 containerd[1613]: time="2026-01-27T05:55:37.624280047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:6db618a671879346c6fb373896e7fc78,Namespace:kube-system,Attempt:0,}" Jan 27 05:55:37.630509 containerd[1613]: time="2026-01-27T05:55:37.630279423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:fd705fb348d77328eefb16f73cbcb849,Namespace:kube-system,Attempt:0,}" Jan 27 05:55:37.653319 containerd[1613]: time="2026-01-27T05:55:37.653267232Z" level=info msg="connecting to shim f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036" 
address="unix:///run/containerd/s/7bd17936fbd3594fd1562acb72b1edb9ba5ea21e07e18bced5b5401f432277d9" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:55:37.703532 containerd[1613]: time="2026-01-27T05:55:37.703471792Z" level=info msg="connecting to shim 8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63" address="unix:///run/containerd/s/5dac0f46514512d0bd4298d37c0caacba83be12fc85c66ae8b44c861d8e195fe" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:55:37.705602 containerd[1613]: time="2026-01-27T05:55:37.705560068Z" level=info msg="connecting to shim e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94" address="unix:///run/containerd/s/ef60f39192866110195d26506f774d8c3babe94a7474dd9aa84a997b72ee3aa4" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:55:37.727652 systemd[1]: Started cri-containerd-f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036.scope - libcontainer container f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036. Jan 27 05:55:37.760000 audit: BPF prog-id=84 op=LOAD Jan 27 05:55:37.763000 audit: BPF prog-id=85 op=LOAD Jan 27 05:55:37.763000 audit[2550]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.763000 audit: BPF prog-id=85 op=UNLOAD Jan 27 05:55:37.763000 audit[2550]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.765000 audit: BPF prog-id=86 op=LOAD Jan 27 05:55:37.765000 audit[2550]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.765000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.766000 audit: BPF prog-id=87 op=LOAD Jan 27 05:55:37.766000 audit[2550]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.766000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.766000 audit: BPF prog-id=87 op=UNLOAD Jan 27 05:55:37.766000 audit[2550]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.766000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.766000 audit: BPF prog-id=86 op=UNLOAD Jan 27 05:55:37.766000 audit[2550]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.766000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.766000 audit: BPF prog-id=88 op=LOAD Jan 27 05:55:37.766000 audit[2550]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=2537 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.766000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631333038333034626139653035353365643363616331323134336664 Jan 27 05:55:37.782858 systemd[1]: Started cri-containerd-8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63.scope - libcontainer container 8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63. Jan 27 05:55:37.796646 systemd[1]: Started cri-containerd-e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94.scope - libcontainer container e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94. 
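The SYSCALL records interleaved with the BPF prog-id LOAD/UNLOAD lines while runc starts each cri-containerd scope all have the same shape: arch=c000003e is the x86_64 audit architecture, syscall=321 is bpf(2) and syscall=3 is close(2) on x86_64, and exit carries the syscall return value (for a successful BPF program load, the new file descriptor). Purely as an illustration of reading those numeric fields, and assuming only the two x86_64 syscall numbers seen in this log:

# Annotate the numeric fields of an audit SYSCALL record like the ones above.
X86_64_SYSCALLS = {3: "close", 321: "bpf"}  # only the numbers that appear in this log

def annotate_syscall(fields: dict) -> str:
    nr = int(fields.get("syscall", -1))
    name = X86_64_SYSCALLS.get(nr, f"syscall#{nr}") if fields.get("arch") == "c000003e" else f"syscall#{nr}"
    return f"{fields.get('comm')} ({fields.get('exe')}) called {name}, exit={fields.get('exit')}"

print(annotate_syscall({"arch": "c000003e", "syscall": "321", "exit": "21",
                        "comm": "runc", "exe": "/usr/bin/runc"}))
# -> runc (/usr/bin/runc) called bpf, exit=21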
Jan 27 05:55:37.813000 audit: BPF prog-id=89 op=LOAD Jan 27 05:55:37.814000 audit: BPF prog-id=90 op=LOAD Jan 27 05:55:37.814000 audit[2599]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.814000 audit: BPF prog-id=90 op=UNLOAD Jan 27 05:55:37.814000 audit[2599]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.815000 audit: BPF prog-id=91 op=LOAD Jan 27 05:55:37.815000 audit[2599]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.815000 audit: BPF prog-id=92 op=LOAD Jan 27 05:55:37.815000 audit[2599]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.815000 audit: BPF prog-id=92 op=UNLOAD Jan 27 05:55:37.815000 audit[2599]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.815000 audit: BPF prog-id=91 op=UNLOAD Jan 27 05:55:37.815000 audit[2599]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.815000 audit: BPF prog-id=93 op=LOAD Jan 27 05:55:37.815000 audit[2599]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.815000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865333061386563343765353134613933666137633139643934636464 Jan 27 05:55:37.831000 audit: BPF prog-id=94 op=LOAD Jan 27 05:55:37.834000 audit: BPF prog-id=95 op=LOAD Jan 27 05:55:37.834000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.834000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.834000 audit: BPF prog-id=95 op=UNLOAD Jan 27 05:55:37.834000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.834000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.835000 audit: BPF prog-id=96 op=LOAD Jan 27 05:55:37.835000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.836000 audit: BPF prog-id=97 op=LOAD Jan 27 05:55:37.836000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2572 pid=2600 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.836000 audit: BPF prog-id=97 op=UNLOAD Jan 27 05:55:37.836000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.836000 audit: BPF prog-id=96 op=UNLOAD Jan 27 05:55:37.836000 audit[2600]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.836000 audit: BPF prog-id=98 op=LOAD Jan 27 05:55:37.836000 audit[2600]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2572 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532313432333135623439613739343434333561313230656565326235 Jan 27 05:55:37.865263 containerd[1613]: time="2026-01-27T05:55:37.865006696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:4b6cc33d9cebe6713adb74370d3180a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036\"" Jan 27 05:55:37.873234 kubelet[2498]: E0127 05:55:37.872806 2498 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" hostnameMaxLen=63 truncatedHostname="kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a127" Jan 27 05:55:37.874959 containerd[1613]: time="2026-01-27T05:55:37.874630299Z" level=info msg="CreateContainer within sandbox \"f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 27 05:55:37.891432 kubelet[2498]: W0127 05:55:37.891292 2498 reflector.go:569] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.128.0.23:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:37.892193 kubelet[2498]: E0127 05:55:37.891576 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.128.0.23:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:37.895303 containerd[1613]: time="2026-01-27T05:55:37.895256289Z" level=info msg="Container 23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:55:37.908204 containerd[1613]: time="2026-01-27T05:55:37.908165375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:6db618a671879346c6fb373896e7fc78,Namespace:kube-system,Attempt:0,} returns sandbox id \"8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63\"" Jan 27 05:55:37.908862 containerd[1613]: time="2026-01-27T05:55:37.908829767Z" level=info msg="CreateContainer within sandbox \"f1308304ba9e0553ed3cac12143fd92dc5679140f8a7604b31cae7b722c27036\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942\"" Jan 27 05:55:37.910416 containerd[1613]: time="2026-01-27T05:55:37.910384531Z" level=info msg="StartContainer for \"23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942\"" Jan 27 05:55:37.911654 kubelet[2498]: E0127 05:55:37.910999 2498 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" hostnameMaxLen=63 truncatedHostname="kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea" Jan 27 05:55:37.914483 containerd[1613]: time="2026-01-27T05:55:37.914427248Z" level=info msg="connecting to shim 23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942" address="unix:///run/containerd/s/7bd17936fbd3594fd1562acb72b1edb9ba5ea21e07e18bced5b5401f432277d9" protocol=ttrpc version=3 Jan 27 05:55:37.915122 containerd[1613]: time="2026-01-27T05:55:37.915081113Z" level=info msg="CreateContainer within sandbox \"8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 27 05:55:37.930238 containerd[1613]: time="2026-01-27T05:55:37.930137155Z" level=info msg="Container 23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:55:37.943673 systemd[1]: Started cri-containerd-23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942.scope - libcontainer container 23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942. 
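The kubelet failures in this stretch of the log (reflector list/watch errors, lease-controller retries, node registration attempts) all reduce to the same root cause: TCP connections to the API server endpoint https://10.128.0.23:6443 are refused, evidently because the kube-apiserver static pod whose sandbox and container are being created above is still coming up. As a hedged illustration using the address taken from the log, a trivial reachability probe one could run on the node:

import socket

def probe(host: str = "10.128.0.23", port: int = 6443, timeout: float = 2.0) -> bool:
    # Plain TCP connect; "connection refused" here matches the reflector errors in the log.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"{host}:{port} unreachable: {exc}")
        return False

probe()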
Jan 27 05:55:37.954140 containerd[1613]: time="2026-01-27T05:55:37.953350233Z" level=info msg="CreateContainer within sandbox \"8e30a8ec47e514a93fa7c19d94cddf7b35dfb5f00e41549b41a809cef6eb1d63\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b\"" Jan 27 05:55:37.956726 containerd[1613]: time="2026-01-27T05:55:37.956693904Z" level=info msg="StartContainer for \"23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b\"" Jan 27 05:55:37.964005 containerd[1613]: time="2026-01-27T05:55:37.963912567Z" level=info msg="connecting to shim 23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b" address="unix:///run/containerd/s/5dac0f46514512d0bd4298d37c0caacba83be12fc85c66ae8b44c861d8e195fe" protocol=ttrpc version=3 Jan 27 05:55:37.970996 kubelet[2498]: I0127 05:55:37.970925 2498 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.971954 kubelet[2498]: E0127 05:55:37.971784 2498 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.23:6443/api/v1/nodes\": dial tcp 10.128.0.23:6443: connect: connection refused" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:37.974549 containerd[1613]: time="2026-01-27T05:55:37.974494964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf,Uid:fd705fb348d77328eefb16f73cbcb849,Namespace:kube-system,Attempt:0,} returns sandbox id \"e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94\"" Jan 27 05:55:37.985383 kubelet[2498]: E0127 05:55:37.984781 2498 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" hostnameMaxLen=63 truncatedHostname="kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a127" Jan 27 05:55:37.986091 containerd[1613]: time="2026-01-27T05:55:37.986044893Z" level=info msg="CreateContainer within sandbox \"e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 27 05:55:37.989000 audit: BPF prog-id=99 op=LOAD Jan 27 05:55:37.995005 kubelet[2498]: W0127 05:55:37.994888 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.23:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:37.995484 kubelet[2498]: E0127 05:55:37.995451 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.23:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:37.993000 audit: BPF prog-id=100 op=LOAD Jan 27 05:55:37.993000 audit[2657]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.993000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.994000 audit: BPF prog-id=100 op=UNLOAD Jan 27 05:55:37.994000 audit[2657]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.994000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.995000 audit: BPF prog-id=101 op=LOAD Jan 27 05:55:37.995000 audit[2657]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.995000 audit: BPF prog-id=102 op=LOAD Jan 27 05:55:37.995000 audit[2657]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.995000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.996000 audit: BPF prog-id=102 op=UNLOAD Jan 27 05:55:37.996000 audit[2657]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.996000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.996000 audit: BPF prog-id=101 op=UNLOAD Jan 27 05:55:37.996000 audit[2657]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.996000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:37.996000 audit: BPF prog-id=103 op=LOAD Jan 27 05:55:37.996000 audit[2657]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2537 pid=2657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:37.996000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233393238653234356431303064633434646166303031303933306238 Jan 27 05:55:38.011722 containerd[1613]: time="2026-01-27T05:55:38.011685812Z" level=info msg="Container e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:55:38.013825 systemd[1]: Started cri-containerd-23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b.scope - libcontainer container 23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b. Jan 27 05:55:38.026256 kubelet[2498]: W0127 05:55:38.026188 2498 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.128.0.23:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.128.0.23:6443: connect: connection refused Jan 27 05:55:38.026383 kubelet[2498]: E0127 05:55:38.026277 2498 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.128.0.23:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.128.0.23:6443: connect: connection refused" logger="UnhandledError" Jan 27 05:55:38.030231 containerd[1613]: time="2026-01-27T05:55:38.030182919Z" level=info msg="CreateContainer within sandbox \"e2142315b49a7944435a120eee2b5a6a752c659c84267e558b1f1ae90a772b94\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6\"" Jan 27 05:55:38.031179 containerd[1613]: time="2026-01-27T05:55:38.031055066Z" level=info msg="StartContainer for \"e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6\"" Jan 27 05:55:38.034685 containerd[1613]: time="2026-01-27T05:55:38.034652325Z" level=info msg="connecting to shim e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6" address="unix:///run/containerd/s/ef60f39192866110195d26506f774d8c3babe94a7474dd9aa84a997b72ee3aa4" protocol=ttrpc version=3 Jan 27 05:55:38.068000 audit: BPF prog-id=104 op=LOAD Jan 27 05:55:38.069000 audit: BPF prog-id=105 op=LOAD Jan 27 05:55:38.069000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.069000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.069000 audit: BPF prog-id=105 op=UNLOAD Jan 27 05:55:38.069000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.069000 audit: BPF prog-id=106 op=LOAD Jan 27 05:55:38.069000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.070000 audit: BPF prog-id=107 op=LOAD Jan 27 05:55:38.070000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.070000 audit: BPF prog-id=107 op=UNLOAD Jan 27 05:55:38.070000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.071000 audit: BPF prog-id=106 op=UNLOAD Jan 27 05:55:38.071000 audit[2683]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.071000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.071000 audit: BPF prog-id=108 op=LOAD Jan 27 05:55:38.071000 audit[2683]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2570 pid=2683 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.071000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233303937666131303230386336363138663538616536313332656532 Jan 27 05:55:38.085724 systemd[1]: Started cri-containerd-e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6.scope - libcontainer container e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6. Jan 27 05:55:38.108750 containerd[1613]: time="2026-01-27T05:55:38.108704044Z" level=info msg="StartContainer for \"23928e245d100dc44daf0010930b878abf0b855cb8af6320b038c0da53a4d942\" returns successfully" Jan 27 05:55:38.114000 audit: BPF prog-id=109 op=LOAD Jan 27 05:55:38.115000 audit: BPF prog-id=110 op=LOAD Jan 27 05:55:38.115000 audit[2705]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.116000 audit: BPF prog-id=110 op=UNLOAD Jan 27 05:55:38.116000 audit[2705]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.117000 audit: BPF prog-id=111 op=LOAD Jan 27 05:55:38.117000 audit[2705]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.117000 audit: BPF prog-id=112 op=LOAD Jan 27 05:55:38.117000 
audit[2705]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.118000 audit: BPF prog-id=112 op=UNLOAD Jan 27 05:55:38.118000 audit[2705]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.118000 audit: BPF prog-id=111 op=UNLOAD Jan 27 05:55:38.118000 audit[2705]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.118000 audit: BPF prog-id=113 op=LOAD Jan 27 05:55:38.118000 audit[2705]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2572 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:38.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533633963336137336339353636393965306263396134356163303133 Jan 27 05:55:38.190278 containerd[1613]: time="2026-01-27T05:55:38.188758421Z" level=info msg="StartContainer for \"23097fa10208c6618f58ae6132ee2cb9ad7204e20c2d46e935edced1876b3a3b\" returns successfully" Jan 27 05:55:38.221546 containerd[1613]: time="2026-01-27T05:55:38.220813051Z" level=info msg="StartContainer for \"e3c9c3a73c956699e0bc9a45ac0131c3d063754b055096173680b97ec1d065d6\" returns successfully" Jan 27 05:55:38.777055 kubelet[2498]: I0127 05:55:38.776668 2498 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:39.040875 kubelet[2498]: E0127 05:55:39.039099 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 
05:55:39.041902 kubelet[2498]: E0127 05:55:39.041669 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:39.047749 kubelet[2498]: E0127 05:55:39.047704 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:40.053004 kubelet[2498]: E0127 05:55:40.052689 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:40.054130 kubelet[2498]: E0127 05:55:40.053500 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:40.054836 kubelet[2498]: E0127 05:55:40.054671 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.058160 kubelet[2498]: E0127 05:55:41.057734 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.061111 kubelet[2498]: E0127 05:55:41.061081 2498 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.156628 kubelet[2498]: E0127 05:55:41.156580 2498 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.219451 kubelet[2498]: I0127 05:55:41.219399 2498 kubelet_node_status.go:78] "Successfully registered node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.251389 kubelet[2498]: I0127 05:55:41.250494 2498 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.332514 kubelet[2498]: E0127 05:55:41.332348 2498 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.332514 kubelet[2498]: I0127 05:55:41.332433 2498 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.343936 kubelet[2498]: E0127 05:55:41.343893 2498 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.343936 kubelet[2498]: I0127 05:55:41.343933 2498 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.350420 kubelet[2498]: E0127 05:55:41.350382 2498 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:41.930923 kubelet[2498]: I0127 05:55:41.930876 2498 apiserver.go:52] "Watching apiserver" Jan 27 05:55:41.950613 kubelet[2498]: I0127 05:55:41.950569 2498 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 27 05:55:42.056226 kubelet[2498]: I0127 05:55:42.055846 2498 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:42.074673 kubelet[2498]: W0127 05:55:42.074636 2498 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Jan 27 05:55:43.348248 systemd[1]: Reload requested from client PID 2762 ('systemctl') (unit session-10.scope)... Jan 27 05:55:43.348274 systemd[1]: Reloading... Jan 27 05:55:43.602465 zram_generator::config[2809]: No configuration found. Jan 27 05:55:44.113875 systemd[1]: Reloading finished in 764 ms. Jan 27 05:55:44.167993 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 05:55:44.181266 systemd[1]: kubelet.service: Deactivated successfully. Jan 27 05:55:44.181753 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:44.180000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:44.182372 systemd[1]: kubelet.service: Consumed 1.056s CPU time, 129.9M memory peak. Jan 27 05:55:44.187319 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 27 05:55:44.187443 kernel: audit: type=1131 audit(1769493344.180:379): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:44.192832 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 27 05:55:44.190000 audit: BPF prog-id=114 op=LOAD Jan 27 05:55:44.190000 audit: BPF prog-id=83 op=UNLOAD Jan 27 05:55:44.190000 audit: BPF prog-id=115 op=LOAD Jan 27 05:55:44.190000 audit: BPF prog-id=67 op=UNLOAD Jan 27 05:55:44.190000 audit: BPF prog-id=116 op=LOAD Jan 27 05:55:44.190000 audit: BPF prog-id=117 op=LOAD Jan 27 05:55:44.190000 audit: BPF prog-id=68 op=UNLOAD Jan 27 05:55:44.195000 audit: BPF prog-id=69 op=UNLOAD Jan 27 05:55:44.225257 kernel: audit: type=1334 audit(1769493344.190:380): prog-id=114 op=LOAD Jan 27 05:55:44.225378 kernel: audit: type=1334 audit(1769493344.190:381): prog-id=83 op=UNLOAD Jan 27 05:55:44.225427 kernel: audit: type=1334 audit(1769493344.190:382): prog-id=115 op=LOAD Jan 27 05:55:44.225470 kernel: audit: type=1334 audit(1769493344.190:383): prog-id=67 op=UNLOAD Jan 27 05:55:44.225504 kernel: audit: type=1334 audit(1769493344.190:384): prog-id=116 op=LOAD Jan 27 05:55:44.225541 kernel: audit: type=1334 audit(1769493344.190:385): prog-id=117 op=LOAD Jan 27 05:55:44.225579 kernel: audit: type=1334 audit(1769493344.190:386): prog-id=68 op=UNLOAD Jan 27 05:55:44.225615 kernel: audit: type=1334 audit(1769493344.195:387): prog-id=69 op=UNLOAD Jan 27 05:55:44.225658 kernel: audit: type=1334 audit(1769493344.200:388): prog-id=118 op=LOAD Jan 27 05:55:44.200000 audit: BPF prog-id=118 op=LOAD Jan 27 05:55:44.200000 audit: BPF prog-id=73 op=UNLOAD Jan 27 05:55:44.200000 audit: BPF prog-id=119 op=LOAD Jan 27 05:55:44.200000 audit: BPF prog-id=120 op=LOAD Jan 27 05:55:44.200000 audit: BPF prog-id=74 op=UNLOAD Jan 27 05:55:44.200000 audit: BPF prog-id=75 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=121 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=70 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=122 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=123 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=71 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=72 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=124 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=64 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=125 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=126 op=LOAD Jan 27 05:55:44.215000 audit: BPF prog-id=65 op=UNLOAD Jan 27 05:55:44.215000 audit: BPF prog-id=66 op=UNLOAD Jan 27 05:55:44.220000 audit: BPF prog-id=127 op=LOAD Jan 27 05:55:44.220000 audit: BPF prog-id=81 op=UNLOAD Jan 27 05:55:44.277000 audit: BPF prog-id=128 op=LOAD Jan 27 05:55:44.277000 audit: BPF prog-id=82 op=UNLOAD Jan 27 05:55:44.277000 audit: BPF prog-id=129 op=LOAD Jan 27 05:55:44.277000 audit: BPF prog-id=130 op=LOAD Jan 27 05:55:44.278000 audit: BPF prog-id=76 op=UNLOAD Jan 27 05:55:44.278000 audit: BPF prog-id=77 op=UNLOAD Jan 27 05:55:44.279000 audit: BPF prog-id=131 op=LOAD Jan 27 05:55:44.279000 audit: BPF prog-id=78 op=UNLOAD Jan 27 05:55:44.279000 audit: BPF prog-id=132 op=LOAD Jan 27 05:55:44.279000 audit: BPF prog-id=133 op=LOAD Jan 27 05:55:44.279000 audit: BPF prog-id=79 op=UNLOAD Jan 27 05:55:44.279000 audit: BPF prog-id=80 op=UNLOAD Jan 27 05:55:44.629776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 05:55:44.631000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:55:44.646874 (kubelet)[2857]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 05:55:44.722901 kubelet[2857]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 05:55:44.722901 kubelet[2857]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 05:55:44.722901 kubelet[2857]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 05:55:44.723480 kubelet[2857]: I0127 05:55:44.723048 2857 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 05:55:44.735735 kubelet[2857]: I0127 05:55:44.735540 2857 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 27 05:55:44.735735 kubelet[2857]: I0127 05:55:44.735573 2857 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 05:55:44.737397 kubelet[2857]: I0127 05:55:44.736573 2857 server.go:954] "Client rotation is on, will bootstrap in background" Jan 27 05:55:44.738845 kubelet[2857]: I0127 05:55:44.738817 2857 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 27 05:55:44.748397 kubelet[2857]: I0127 05:55:44.747889 2857 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 05:55:44.755638 kubelet[2857]: I0127 05:55:44.755592 2857 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 05:55:44.762294 kubelet[2857]: I0127 05:55:44.760826 2857 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 27 05:55:44.762294 kubelet[2857]: I0127 05:55:44.761226 2857 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 05:55:44.762294 kubelet[2857]: I0127 05:55:44.761267 2857 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 05:55:44.762294 kubelet[2857]: I0127 05:55:44.762113 2857 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762134 2857 container_manager_linux.go:304] "Creating device plugin manager" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762197 2857 state_mem.go:36] "Initialized new in-memory state store" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762519 2857 kubelet.go:446] "Attempting to sync node with API server" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762547 2857 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762581 2857 kubelet.go:352] "Adding apiserver pod source" Jan 27 05:55:44.762676 kubelet[2857]: I0127 05:55:44.762596 2857 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 05:55:44.764934 kubelet[2857]: I0127 05:55:44.764910 2857 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 05:55:44.767116 kubelet[2857]: I0127 05:55:44.767090 2857 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 05:55:44.768793 kubelet[2857]: I0127 05:55:44.768770 2857 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 27 05:55:44.769404 kubelet[2857]: I0127 05:55:44.768928 2857 server.go:1287] "Started kubelet" Jan 27 05:55:44.779607 kubelet[2857]: I0127 05:55:44.779584 2857 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 05:55:44.789279 kubelet[2857]: I0127 
05:55:44.788511 2857 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 05:55:44.793119 kubelet[2857]: I0127 05:55:44.793090 2857 server.go:479] "Adding debug handlers to kubelet server" Jan 27 05:55:44.800505 kubelet[2857]: I0127 05:55:44.799512 2857 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 05:55:44.801697 kubelet[2857]: I0127 05:55:44.800795 2857 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 05:55:44.801697 kubelet[2857]: E0127 05:55:44.800479 2857 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 05:55:44.802069 kubelet[2857]: I0127 05:55:44.802048 2857 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 05:55:44.813328 kubelet[2857]: I0127 05:55:44.813303 2857 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 27 05:55:44.814757 kubelet[2857]: E0127 05:55:44.814720 2857 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" not found" Jan 27 05:55:44.828688 kubelet[2857]: I0127 05:55:44.827608 2857 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 27 05:55:44.828688 kubelet[2857]: I0127 05:55:44.827776 2857 reconciler.go:26] "Reconciler: start to sync state" Jan 27 05:55:44.833958 kubelet[2857]: I0127 05:55:44.832838 2857 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 05:55:44.846607 kubelet[2857]: I0127 05:55:44.844614 2857 factory.go:221] Registration of the containerd container factory successfully Jan 27 05:55:44.846607 kubelet[2857]: I0127 05:55:44.844639 2857 factory.go:221] Registration of the systemd container factory successfully Jan 27 05:55:44.846607 kubelet[2857]: I0127 05:55:44.845223 2857 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 05:55:44.849809 kubelet[2857]: I0127 05:55:44.848907 2857 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 05:55:44.849809 kubelet[2857]: I0127 05:55:44.848943 2857 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 27 05:55:44.849809 kubelet[2857]: I0127 05:55:44.848968 2857 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
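Editor's note on the container_manager_linux.go entry above: the kubelet dumps its effective node config as one JSON object (nodeConfig={...}), which makes the hard-eviction thresholds hard to read in place. A small illustrative sketch, assuming that single journal entry is available as a plain string; the helper name and the shortened sample line are not from the log:

import json
import re

def eviction_thresholds(journal_line: str) -> list:
    # Pull the nodeConfig JSON out of the entry and return its eviction thresholds.
    match = re.search(r'nodeConfig=(\{.*\})', journal_line)
    if not match:
        return []
    node_config = json.loads(match.group(1))
    return node_config.get("HardEvictionThresholds", [])

if __name__ == "__main__":
    # Shortened stand-in for the real entry above; only the JSON shape matters here.
    sample = ('container_manager_linux.go:273] "Creating Container Manager object '
              'based on Node Config" nodeConfig={"CgroupDriver":"systemd",'
              '"HardEvictionThresholds":[{"Signal":"memory.available",'
              '"Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},'
              '"GracePeriod":0,"MinReclaim":null}],"CgroupVersion":2}')
    for threshold in eviction_thresholds(sample):
        print(threshold["Signal"], threshold["Operator"], threshold["Value"])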
Jan 27 05:55:44.849809 kubelet[2857]: I0127 05:55:44.848980 2857 kubelet.go:2382] "Starting kubelet main sync loop" Jan 27 05:55:44.849809 kubelet[2857]: E0127 05:55:44.849044 2857 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 05:55:44.949502 kubelet[2857]: E0127 05:55:44.949471 2857 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 27 05:55:44.966549 kubelet[2857]: I0127 05:55:44.966277 2857 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 05:55:44.966549 kubelet[2857]: I0127 05:55:44.966300 2857 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 05:55:44.966549 kubelet[2857]: I0127 05:55:44.966325 2857 state_mem.go:36] "Initialized new in-memory state store" Jan 27 05:55:44.966801 kubelet[2857]: I0127 05:55:44.966646 2857 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 27 05:55:44.966801 kubelet[2857]: I0127 05:55:44.966665 2857 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 27 05:55:44.966801 kubelet[2857]: I0127 05:55:44.966751 2857 policy_none.go:49] "None policy: Start" Jan 27 05:55:44.966801 kubelet[2857]: I0127 05:55:44.966768 2857 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 27 05:55:44.966801 kubelet[2857]: I0127 05:55:44.966786 2857 state_mem.go:35] "Initializing new in-memory state store" Jan 27 05:55:44.967093 kubelet[2857]: I0127 05:55:44.967010 2857 state_mem.go:75] "Updated machine memory state" Jan 27 05:55:44.976325 kubelet[2857]: I0127 05:55:44.975827 2857 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 05:55:44.976325 kubelet[2857]: I0127 05:55:44.976043 2857 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 05:55:44.976325 kubelet[2857]: I0127 05:55:44.976057 2857 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 05:55:44.976325 kubelet[2857]: I0127 05:55:44.976315 2857 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 05:55:44.984626 kubelet[2857]: E0127 05:55:44.983354 2857 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 27 05:55:45.097007 kubelet[2857]: I0127 05:55:45.096340 2857 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.106652 kubelet[2857]: I0127 05:55:45.106598 2857 kubelet_node_status.go:124] "Node was previously registered" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.108533 kubelet[2857]: I0127 05:55:45.107118 2857 kubelet_node_status.go:78] "Successfully registered node" node="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.150789 kubelet[2857]: I0127 05:55:45.150755 2857 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.153390 kubelet[2857]: I0127 05:55:45.151482 2857 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.153390 kubelet[2857]: I0127 05:55:45.152248 2857 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.159749 kubelet[2857]: W0127 05:55:45.159592 2857 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Jan 27 05:55:45.159749 kubelet[2857]: E0127 05:55:45.159664 2857 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" already exists" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.160601 kubelet[2857]: W0127 05:55:45.160539 2857 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Jan 27 05:55:45.163265 kubelet[2857]: W0127 05:55:45.163188 2857 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Jan 27 05:55:45.232479 kubelet[2857]: I0127 05:55:45.232114 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-ca-certs\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.232479 kubelet[2857]: I0127 05:55:45.232206 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-k8s-certs\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.232479 kubelet[2857]: I0127 05:55:45.232244 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-k8s-certs\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: 
\"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.232479 kubelet[2857]: I0127 05:55:45.232275 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-kubeconfig\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.232794 kubelet[2857]: I0127 05:55:45.232313 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd705fb348d77328eefb16f73cbcb849-kubeconfig\") pod \"kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"fd705fb348d77328eefb16f73cbcb849\") " pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.232794 kubelet[2857]: I0127 05:55:45.232343 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-ca-certs\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.234348 kubelet[2857]: I0127 05:55:45.234061 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-flexvolume-dir\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.234348 kubelet[2857]: I0127 05:55:45.234135 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6db618a671879346c6fb373896e7fc78-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"6db618a671879346c6fb373896e7fc78\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.234777 kubelet[2857]: I0127 05:55:45.234580 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b6cc33d9cebe6713adb74370d3180a3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" (UID: \"4b6cc33d9cebe6713adb74370d3180a3\") " pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.765038 kubelet[2857]: I0127 05:55:45.764993 2857 apiserver.go:52] "Watching apiserver" Jan 27 05:55:45.815385 kubelet[2857]: I0127 05:55:45.815244 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" podStartSLOduration=0.815220288 podStartE2EDuration="815.220288ms" podCreationTimestamp="2026-01-27 05:55:45 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:55:45.804326173 +0000 UTC m=+1.149327494" watchObservedRunningTime="2026-01-27 05:55:45.815220288 +0000 UTC m=+1.160221865" Jan 27 05:55:45.816476 kubelet[2857]: I0127 05:55:45.816407 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" podStartSLOduration=3.8163885669999997 podStartE2EDuration="3.816388567s" podCreationTimestamp="2026-01-27 05:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:55:45.816168703 +0000 UTC m=+1.161170010" watchObservedRunningTime="2026-01-27 05:55:45.816388567 +0000 UTC m=+1.161389883" Jan 27 05:55:45.828043 kubelet[2857]: I0127 05:55:45.827773 2857 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 27 05:55:45.828948 kubelet[2857]: I0127 05:55:45.828772 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" podStartSLOduration=0.828657681 podStartE2EDuration="828.657681ms" podCreationTimestamp="2026-01-27 05:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:55:45.827925509 +0000 UTC m=+1.172926828" watchObservedRunningTime="2026-01-27 05:55:45.828657681 +0000 UTC m=+1.173658996" Jan 27 05:55:45.936404 kubelet[2857]: I0127 05:55:45.936195 2857 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:45.945705 kubelet[2857]: W0127 05:55:45.945671 2857 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Jan 27 05:55:45.945874 kubelet[2857]: E0127 05:55:45.945745 2857 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" already exists" pod="kube-system/kube-scheduler-ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:55:48.021503 update_engine[1580]: I20260127 05:55:48.021422 1580 update_attempter.cc:509] Updating boot flags... Jan 27 05:55:48.579394 kubelet[2857]: I0127 05:55:48.578901 2857 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 27 05:55:48.580137 containerd[1613]: time="2026-01-27T05:55:48.580048652Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 27 05:55:48.580645 kubelet[2857]: I0127 05:55:48.580591 2857 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 27 05:55:49.246063 systemd[1]: Created slice kubepods-besteffort-pod4ce0948a_2c5e_4766_bc28_9232cf54912b.slice - libcontainer container kubepods-besteffort-pod4ce0948a_2c5e_4766_bc28_9232cf54912b.slice. 
Jan 27 05:55:49.277429 kubelet[2857]: I0127 05:55:49.277384 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4ce0948a-2c5e-4766-bc28-9232cf54912b-xtables-lock\") pod \"kube-proxy-r5zf4\" (UID: \"4ce0948a-2c5e-4766-bc28-9232cf54912b\") " pod="kube-system/kube-proxy-r5zf4" Jan 27 05:55:49.277698 kubelet[2857]: I0127 05:55:49.277669 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4ce0948a-2c5e-4766-bc28-9232cf54912b-kube-proxy\") pod \"kube-proxy-r5zf4\" (UID: \"4ce0948a-2c5e-4766-bc28-9232cf54912b\") " pod="kube-system/kube-proxy-r5zf4" Jan 27 05:55:49.277913 kubelet[2857]: I0127 05:55:49.277734 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ce0948a-2c5e-4766-bc28-9232cf54912b-lib-modules\") pod \"kube-proxy-r5zf4\" (UID: \"4ce0948a-2c5e-4766-bc28-9232cf54912b\") " pod="kube-system/kube-proxy-r5zf4" Jan 27 05:55:49.277913 kubelet[2857]: I0127 05:55:49.277766 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzltm\" (UniqueName: \"kubernetes.io/projected/4ce0948a-2c5e-4766-bc28-9232cf54912b-kube-api-access-nzltm\") pod \"kube-proxy-r5zf4\" (UID: \"4ce0948a-2c5e-4766-bc28-9232cf54912b\") " pod="kube-system/kube-proxy-r5zf4" Jan 27 05:55:49.556123 containerd[1613]: time="2026-01-27T05:55:49.555962031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-r5zf4,Uid:4ce0948a-2c5e-4766-bc28-9232cf54912b,Namespace:kube-system,Attempt:0,}" Jan 27 05:55:49.587245 containerd[1613]: time="2026-01-27T05:55:49.587132389Z" level=info msg="connecting to shim 96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd" address="unix:///run/containerd/s/5a4c20f967f10ce20279877717faa2d8ff6088bf6e6f5cf7ae11bfe8b7946b9d" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:55:49.627869 systemd[1]: Started cri-containerd-96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd.scope - libcontainer container 96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd. 
Jan 27 05:55:49.643000 audit: BPF prog-id=134 op=LOAD Jan 27 05:55:49.651351 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 27 05:55:49.651471 kernel: audit: type=1334 audit(1769493349.643:421): prog-id=134 op=LOAD Jan 27 05:55:49.643000 audit: BPF prog-id=135 op=LOAD Jan 27 05:55:49.664561 kernel: audit: type=1334 audit(1769493349.643:422): prog-id=135 op=LOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.694405 kernel: audit: type=1300 audit(1769493349.643:422): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.723143 kernel: audit: type=1327 audit(1769493349.643:422): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.724281 kernel: audit: type=1334 audit(1769493349.643:423): prog-id=135 op=UNLOAD Jan 27 05:55:49.643000 audit: BPF prog-id=135 op=UNLOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.759633 kernel: audit: type=1300 audit(1769493349.643:423): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.643000 audit: BPF prog-id=136 op=LOAD Jan 27 05:55:49.801431 kernel: audit: type=1327 audit(1769493349.643:423): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.801555 kernel: audit: type=1334 audit(1769493349.643:424): prog-id=136 op=LOAD Jan 27 05:55:49.801598 kernel: audit: type=1300 audit(1769493349.643:424): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.839552 containerd[1613]: time="2026-01-27T05:55:49.839425576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-r5zf4,Uid:4ce0948a-2c5e-4766-bc28-9232cf54912b,Namespace:kube-system,Attempt:0,} returns sandbox id \"96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd\"" Jan 27 05:55:49.859842 kernel: audit: type=1327 audit(1769493349.643:424): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.643000 audit: BPF prog-id=137 op=LOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.643000 audit: BPF prog-id=137 op=UNLOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.643000 audit: BPF prog-id=136 op=UNLOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.643000 audit: BPF prog-id=138 op=LOAD Jan 27 05:55:49.643000 audit[2940]: SYSCALL arch=c000003e syscall=321 
success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2929 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936613733353037356539616638323534666432643064613533666133 Jan 27 05:55:49.862878 containerd[1613]: time="2026-01-27T05:55:49.862102530Z" level=info msg="CreateContainer within sandbox \"96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 27 05:55:49.879829 systemd[1]: Created slice kubepods-besteffort-pod5f5a5e4a_ae26_4a99_ba5c_efb955ffb3ea.slice - libcontainer container kubepods-besteffort-pod5f5a5e4a_ae26_4a99_ba5c_efb955ffb3ea.slice. Jan 27 05:55:49.881734 kubelet[2857]: I0127 05:55:49.881609 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea-var-lib-calico\") pod \"tigera-operator-7dcd859c48-htwqh\" (UID: \"5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea\") " pod="tigera-operator/tigera-operator-7dcd859c48-htwqh" Jan 27 05:55:49.881734 kubelet[2857]: I0127 05:55:49.881665 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wbx\" (UniqueName: \"kubernetes.io/projected/5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea-kube-api-access-67wbx\") pod \"tigera-operator-7dcd859c48-htwqh\" (UID: \"5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea\") " pod="tigera-operator/tigera-operator-7dcd859c48-htwqh" Jan 27 05:55:49.892771 containerd[1613]: time="2026-01-27T05:55:49.891492884Z" level=info msg="Container aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:55:49.904776 containerd[1613]: time="2026-01-27T05:55:49.904715195Z" level=info msg="CreateContainer within sandbox \"96a735075e9af8254fd2d0da53fa38313ed5fdd25fab297a07b2e4e509e156dd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4\"" Jan 27 05:55:49.906393 containerd[1613]: time="2026-01-27T05:55:49.905896185Z" level=info msg="StartContainer for \"aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4\"" Jan 27 05:55:49.908240 containerd[1613]: time="2026-01-27T05:55:49.908197242Z" level=info msg="connecting to shim aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4" address="unix:///run/containerd/s/5a4c20f967f10ce20279877717faa2d8ff6088bf6e6f5cf7ae11bfe8b7946b9d" protocol=ttrpc version=3 Jan 27 05:55:49.935705 systemd[1]: Started cri-containerd-aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4.scope - libcontainer container aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4. 
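Editor's note on the PROCTITLE fields in the audit records above (and in the iptables/ip6tables records further down): they carry the process command line, hex-encoded with NUL bytes separating the arguments, so 72756E63 decodes to "runc" and 69707461626C6573 to "iptables"; long titles appear truncated in the records above and decode only partially. A short illustrative decoder, not part of the log:

def decode_proctitle(hex_title: str) -> str:
    # PROCTITLE is argv from /proc/<pid>/cmdline: hex bytes with NUL separators.
    return bytes.fromhex(hex_title).replace(b"\x00", b" ").decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Prefix of one of the runc records above, shortened for the example.
    print(decode_proctitle("72756E63002D2D726F6F74002F72756E"))  # -> "runc --root /run"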
Jan 27 05:55:49.991000 audit: BPF prog-id=139 op=LOAD Jan 27 05:55:49.991000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2929 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165633330613737366162393630356663353266656237316337336261 Jan 27 05:55:49.991000 audit: BPF prog-id=140 op=LOAD Jan 27 05:55:49.991000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2929 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165633330613737366162393630356663353266656237316337336261 Jan 27 05:55:49.991000 audit: BPF prog-id=140 op=UNLOAD Jan 27 05:55:49.991000 audit[2966]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165633330613737366162393630356663353266656237316337336261 Jan 27 05:55:49.991000 audit: BPF prog-id=139 op=UNLOAD Jan 27 05:55:49.991000 audit[2966]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2929 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165633330613737366162393630356663353266656237316337336261 Jan 27 05:55:49.991000 audit: BPF prog-id=141 op=LOAD Jan 27 05:55:49.991000 audit[2966]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2929 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:49.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165633330613737366162393630356663353266656237316337336261 Jan 27 05:55:50.037250 containerd[1613]: time="2026-01-27T05:55:50.037115616Z" level=info msg="StartContainer for 
\"aec30a776ab9605fc52feb71c73ba9323c72f240cb0fd6eb39176fd5b249eeb4\" returns successfully" Jan 27 05:55:50.187310 containerd[1613]: time="2026-01-27T05:55:50.185609314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-htwqh,Uid:5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea,Namespace:tigera-operator,Attempt:0,}" Jan 27 05:55:50.204000 audit[3033]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3033 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.204000 audit[3033]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdbbeb68e0 a2=0 a3=7ffdbbeb68cc items=0 ppid=2979 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.204000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 05:55:50.214000 audit[3038]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_chain pid=3038 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.214000 audit[3038]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff48fb6e30 a2=0 a3=7fff48fb6e1c items=0 ppid=2979 pid=3038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.214000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 05:55:50.220540 containerd[1613]: time="2026-01-27T05:55:50.220419134Z" level=info msg="connecting to shim f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358" address="unix:///run/containerd/s/d798c532535ba816dca01fc4e868ca1cf472ee2e28bc0808bd7964f46316e9dc" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:55:50.220000 audit[3034]: NETFILTER_CFG table=mangle:56 family=2 entries=1 op=nft_register_chain pid=3034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.220000 audit[3034]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff2db09110 a2=0 a3=c61ef5c349d4b766 items=0 ppid=2979 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.220000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 05:55:50.229000 audit[3047]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.229000 audit[3047]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee38d94f0 a2=0 a3=7ffee38d94dc items=0 ppid=2979 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.229000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 05:55:50.230000 audit[3046]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3046 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 
05:55:50.230000 audit[3046]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd09671260 a2=0 a3=7ffd0967124c items=0 ppid=2979 pid=3046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.230000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 05:55:50.233000 audit[3057]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.233000 audit[3057]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd80977dc0 a2=0 a3=7ffd80977dac items=0 ppid=2979 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.233000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 05:55:50.261639 systemd[1]: Started cri-containerd-f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358.scope - libcontainer container f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358. Jan 27 05:55:50.276000 audit: BPF prog-id=142 op=LOAD Jan 27 05:55:50.276000 audit: BPF prog-id=143 op=LOAD Jan 27 05:55:50.276000 audit[3059]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.276000 audit: BPF prog-id=143 op=UNLOAD Jan 27 05:55:50.276000 audit[3059]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.276000 audit: BPF prog-id=144 op=LOAD Jan 27 05:55:50.276000 audit[3059]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.277000 audit: BPF prog-id=145 
op=LOAD Jan 27 05:55:50.277000 audit[3059]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.277000 audit: BPF prog-id=145 op=UNLOAD Jan 27 05:55:50.277000 audit[3059]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.277000 audit: BPF prog-id=144 op=UNLOAD Jan 27 05:55:50.277000 audit[3059]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.277000 audit: BPF prog-id=146 op=LOAD Jan 27 05:55:50.277000 audit[3059]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3045 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633623166386235393230346364353637663163356264336538646334 Jan 27 05:55:50.308000 audit[3077]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.308000 audit[3077]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffa28778d0 a2=0 a3=7fffa28778bc items=0 ppid=2979 pid=3077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.308000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 05:55:50.317000 audit[3080]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.317000 
audit[3080]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd21f0c6a0 a2=0 a3=7ffd21f0c68c items=0 ppid=2979 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.317000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 27 05:55:50.327000 audit[3089]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.327000 audit[3089]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff020d4140 a2=0 a3=7fff020d412c items=0 ppid=2979 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.327000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 27 05:55:50.332535 containerd[1613]: time="2026-01-27T05:55:50.332486727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-htwqh,Uid:5f5a5e4a-ae26-4a99-ba5c-efb955ffb3ea,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358\"" Jan 27 05:55:50.335000 audit[3090]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3090 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.335000 audit[3090]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffceb92f810 a2=0 a3=7ffceb92f7fc items=0 ppid=2979 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.335000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 05:55:50.338035 containerd[1613]: time="2026-01-27T05:55:50.337514106Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 27 05:55:50.343000 audit[3092]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3092 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.343000 audit[3092]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe4f451e00 a2=0 a3=7ffe4f451dec items=0 ppid=2979 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.343000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 05:55:50.346000 audit[3093]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3093 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.346000 audit[3093]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc47147c90 a2=0 a3=7ffc47147c7c items=0 ppid=2979 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.346000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 05:55:50.351000 audit[3095]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3095 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.351000 audit[3095]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc0d43fe90 a2=0 a3=7ffc0d43fe7c items=0 ppid=2979 pid=3095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.351000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 27 05:55:50.356000 audit[3098]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.356000 audit[3098]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc0cd77690 a2=0 a3=7ffc0cd7767c items=0 ppid=2979 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.356000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 27 05:55:50.358000 audit[3099]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.358000 audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe18f90100 a2=0 a3=7ffe18f900ec items=0 ppid=2979 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.358000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 05:55:50.363000 audit[3101]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.363000 audit[3101]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff37aa5420 a2=0 a3=7fff37aa540c items=0 ppid=2979 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.363000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 05:55:50.365000 audit[3102]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.365000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff18b666d0 a2=0 a3=7fff18b666bc items=0 ppid=2979 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.365000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 05:55:50.371000 audit[3104]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.371000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe43805300 a2=0 a3=7ffe438052ec items=0 ppid=2979 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.371000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 05:55:50.377000 audit[3107]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.377000 audit[3107]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcc7dd45e0 a2=0 a3=7ffcc7dd45cc items=0 ppid=2979 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.377000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 05:55:50.382000 audit[3110]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3110 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.382000 audit[3110]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffffafcbaa0 a2=0 a3=7ffffafcba8c items=0 ppid=2979 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.382000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 27 05:55:50.384000 audit[3111]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
Jan 27 05:55:50.384000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe5cb30d00 a2=0 a3=7ffe5cb30cec items=0 ppid=2979 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.384000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 05:55:50.388000 audit[3113]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.388000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffdd390f0c0 a2=0 a3=7ffdd390f0ac items=0 ppid=2979 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.388000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 05:55:50.395000 audit[3116]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.395000 audit[3116]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe17109990 a2=0 a3=7ffe1710997c items=0 ppid=2979 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.395000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 05:55:50.397000 audit[3117]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.397000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffffe7ec040 a2=0 a3=7ffffe7ec02c items=0 ppid=2979 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.397000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 05:55:50.413000 audit[3119]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 05:55:50.413000 audit[3119]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fffb0fe7b30 a2=0 a3=7fffb0fe7b1c items=0 ppid=2979 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.413000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 05:55:50.449000 
audit[3125]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:55:50.449000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffffc242370 a2=0 a3=7ffffc24235c items=0 ppid=2979 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.449000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:55:50.458000 audit[3125]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3125 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:55:50.458000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffffc242370 a2=0 a3=7ffffc24235c items=0 ppid=2979 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.458000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:55:50.460000 audit[3130]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.460000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe4216d290 a2=0 a3=7ffe4216d27c items=0 ppid=2979 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.460000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 05:55:50.466000 audit[3132]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.466000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc67bb74c0 a2=0 a3=7ffc67bb74ac items=0 ppid=2979 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.466000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 27 05:55:50.472000 audit[3135]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.472000 audit[3135]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe72d96540 a2=0 a3=7ffe72d9652c items=0 ppid=2979 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.472000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 27 05:55:50.474000 audit[3136]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.474000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc00787720 a2=0 a3=7ffc0078770c items=0 ppid=2979 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.474000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 05:55:50.479000 audit[3138]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.479000 audit[3138]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe6fbd12e0 a2=0 a3=7ffe6fbd12cc items=0 ppid=2979 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.479000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 05:55:50.481000 audit[3139]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.481000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe6d4618c0 a2=0 a3=7ffe6d4618ac items=0 ppid=2979 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.481000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 05:55:50.486000 audit[3141]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.486000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc1050c2a0 a2=0 a3=7ffc1050c28c items=0 ppid=2979 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.486000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 27 05:55:50.494000 audit[3144]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3144 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.494000 audit[3144]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fff6a295570 a2=0 
a3=7fff6a29555c items=0 ppid=2979 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.494000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 27 05:55:50.496000 audit[3145]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.496000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4252d0e0 a2=0 a3=7fff4252d0cc items=0 ppid=2979 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.496000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 05:55:50.500000 audit[3147]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.500000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe713c3040 a2=0 a3=7ffe713c302c items=0 ppid=2979 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.500000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 05:55:50.502000 audit[3148]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.502000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd6b0c7280 a2=0 a3=7ffd6b0c726c items=0 ppid=2979 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.502000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 05:55:50.506000 audit[3150]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.506000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffee10cd80 a2=0 a3=7fffee10cd6c items=0 ppid=2979 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.506000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 05:55:50.512000 
audit[3153]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.512000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffa9b2d810 a2=0 a3=7fffa9b2d7fc items=0 ppid=2979 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.512000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 27 05:55:50.520000 audit[3156]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.520000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda8c6b4f0 a2=0 a3=7ffda8c6b4dc items=0 ppid=2979 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.520000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 27 05:55:50.523000 audit[3157]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.523000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff7724f380 a2=0 a3=7fff7724f36c items=0 ppid=2979 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.523000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 05:55:50.527000 audit[3159]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.527000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc496bb540 a2=0 a3=7ffc496bb52c items=0 ppid=2979 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.527000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 05:55:50.533000 audit[3162]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.533000 audit[3162]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd00703ec0 a2=0 a3=7ffd00703eac items=0 ppid=2979 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.533000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 05:55:50.535000 audit[3163]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.535000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeec0ad880 a2=0 a3=7ffeec0ad86c items=0 ppid=2979 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.535000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 05:55:50.539000 audit[3165]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.539000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffc4d9888e0 a2=0 a3=7ffc4d9888cc items=0 ppid=2979 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.539000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 05:55:50.540000 audit[3166]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.540000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff80ae8790 a2=0 a3=7fff80ae877c items=0 ppid=2979 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.540000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 05:55:50.545000 audit[3168]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3168 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.545000 audit[3168]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffcf28e2780 a2=0 a3=7ffcf28e276c items=0 ppid=2979 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.545000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 05:55:50.552000 audit[3171]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 05:55:50.552000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe88ad79b0 a2=0 a3=7ffe88ad799c items=0 ppid=2979 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.552000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 05:55:50.559000 audit[3173]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 05:55:50.559000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fff95478420 a2=0 a3=7fff9547840c items=0 ppid=2979 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.559000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:55:50.560000 audit[3173]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 05:55:50.560000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fff95478420 a2=0 a3=7fff9547840c items=0 ppid=2979 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:50.560000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:55:50.979857 kubelet[2857]: I0127 05:55:50.979601 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-r5zf4" podStartSLOduration=1.979577397 podStartE2EDuration="1.979577397s" podCreationTimestamp="2026-01-27 05:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:55:50.979294108 +0000 UTC m=+6.324295431" watchObservedRunningTime="2026-01-27 05:55:50.979577397 +0000 UTC m=+6.324578717" Jan 27 05:55:51.577428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2584842306.mount: Deactivated successfully. 
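The PROCTITLE values in the audit records above are the audited processes' command lines, hex-encoded with NUL bytes separating the argv elements. A minimal decoding sketch (Python, not part of the log; the helper name decode_proctitle is illustrative, and the sample value is copied verbatim from the first chain-creation record above):

def decode_proctitle(hex_value: str) -> str:
    # Audit PROCTITLE is the raw argv of the process, hex-encoded,
    # with NUL bytes between the individual arguments.
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", errors="replace")
                    for part in raw.split(b"\x00") if part)

sample = ("69707461626C6573002D770035002D5700313030303030"
          "002D4E004B5542452D5345525649434553002D740066696C746572")
print(decode_proctitle(sample))
# -> iptables -w 5 -W 100000 -N KUBE-SERVICES -t filter

Decoded this way, the run of records above is what appears to be kube-proxy (the kube-proxy-r5zf4 pod reported started just afterwards, and all calls share ppid=2979) creating and wiring up its KUBE-SERVICES, KUBE-EXTERNAL-SERVICES, KUBE-NODEPORTS, KUBE-FORWARD, KUBE-PROXY-FIREWALL, KUBE-POSTROUTING and KUBE-FIREWALL chains in the filter and nat tables, first for IPv4 (family=2) and then for IPv6 (family=10).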
Jan 27 05:55:52.433728 containerd[1613]: time="2026-01-27T05:55:52.433661902Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:52.435035 containerd[1613]: time="2026-01-27T05:55:52.434871245Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 27 05:55:52.436154 containerd[1613]: time="2026-01-27T05:55:52.436066774Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:52.438742 containerd[1613]: time="2026-01-27T05:55:52.438673670Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:55:52.439766 containerd[1613]: time="2026-01-27T05:55:52.439607789Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.102057081s" Jan 27 05:55:52.439766 containerd[1613]: time="2026-01-27T05:55:52.439649689Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 27 05:55:52.442794 containerd[1613]: time="2026-01-27T05:55:52.442754649Z" level=info msg="CreateContainer within sandbox \"f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 27 05:55:52.452577 containerd[1613]: time="2026-01-27T05:55:52.452526721Z" level=info msg="Container 5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:55:52.463486 containerd[1613]: time="2026-01-27T05:55:52.463442957Z" level=info msg="CreateContainer within sandbox \"f3b1f8b59204cd567f1c5bd3e8dc4da5173e28f29cfbd2a7ebb2d7c8959da358\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5\"" Jan 27 05:55:52.464911 containerd[1613]: time="2026-01-27T05:55:52.464147482Z" level=info msg="StartContainer for \"5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5\"" Jan 27 05:55:52.466113 containerd[1613]: time="2026-01-27T05:55:52.466058602Z" level=info msg="connecting to shim 5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5" address="unix:///run/containerd/s/d798c532535ba816dca01fc4e868ca1cf472ee2e28bc0808bd7964f46316e9dc" protocol=ttrpc version=3 Jan 27 05:55:52.497581 systemd[1]: Started cri-containerd-5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5.scope - libcontainer container 5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5. 
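As a quick back-of-the-envelope check on the pull reported above, the figures in the containerd lines can be combined directly; this sketch assumes "bytes read" covers the whole compressed transfer and that the reported duration spans the whole pull (both assumptions, not something the log states):

# Values copied from the "stop pulling" / "Pulled image" lines above.
bytes_read = 23_558_205      # compressed bytes reported as transferred
unpacked_size = 25_057_686   # size reported for the pulled operator image
duration_s = 2.102057081     # reported pull duration

print(f"transfer rate ~ {bytes_read / duration_s / 1e6:.1f} MB/s")
print(f"compression   ~ {unpacked_size / bytes_read:.2f}x")
# -> roughly 11.2 MB/s, with a ~1.06x compressed-to-unpacked ratio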
Jan 27 05:55:52.515000 audit: BPF prog-id=147 op=LOAD Jan 27 05:55:52.516000 audit: BPF prog-id=148 op=LOAD Jan 27 05:55:52.516000 audit[3185]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.516000 audit: BPF prog-id=148 op=UNLOAD Jan 27 05:55:52.516000 audit[3185]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.517000 audit: BPF prog-id=149 op=LOAD Jan 27 05:55:52.517000 audit[3185]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.517000 audit: BPF prog-id=150 op=LOAD Jan 27 05:55:52.517000 audit[3185]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.517000 audit: BPF prog-id=150 op=UNLOAD Jan 27 05:55:52.517000 audit[3185]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.518000 audit: BPF prog-id=149 op=UNLOAD Jan 27 05:55:52.518000 audit[3185]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.518000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.518000 audit: BPF prog-id=151 op=LOAD Jan 27 05:55:52.518000 audit[3185]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3045 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:55:52.518000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563383163326438343036303962303063323465636435343966386234 Jan 27 05:55:52.552476 containerd[1613]: time="2026-01-27T05:55:52.552354590Z" level=info msg="StartContainer for \"5c81c2d840609b00c24ecd549f8b4eb34516d466990db3bb54873e9b1eb0ebd5\" returns successfully" Jan 27 05:55:53.328306 kubelet[2857]: I0127 05:55:53.327593 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-htwqh" podStartSLOduration=2.221229848 podStartE2EDuration="4.327568854s" podCreationTimestamp="2026-01-27 05:55:49 +0000 UTC" firstStartedPulling="2026-01-27 05:55:50.334440092 +0000 UTC m=+5.679441403" lastFinishedPulling="2026-01-27 05:55:52.440779114 +0000 UTC m=+7.785780409" observedRunningTime="2026-01-27 05:55:52.972221938 +0000 UTC m=+8.317223259" watchObservedRunningTime="2026-01-27 05:55:53.327568854 +0000 UTC m=+8.672570171" Jan 27 05:55:59.686000 audit[1938]: USER_END pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:59.688206 sudo[1938]: pam_unix(sudo:session): session closed for user root Jan 27 05:55:59.693784 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 27 05:55:59.693898 kernel: audit: type=1106 audit(1769493359.686:501): pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:59.740240 kernel: audit: type=1104 audit(1769493359.686:502): pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 05:55:59.686000 audit[1938]: CRED_DISP pid=1938 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 27 05:55:59.740568 sshd[1937]: Connection closed by 4.153.228.146 port 53978 Jan 27 05:55:59.741601 sshd-session[1933]: pam_unix(sshd:session): session closed for user core Jan 27 05:55:59.744000 audit[1933]: USER_END pid=1933 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:59.784394 kernel: audit: type=1106 audit(1769493359.744:503): pid=1933 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:59.787321 systemd[1]: sshd@8-10.128.0.23:22-4.153.228.146:53978.service: Deactivated successfully. Jan 27 05:55:59.744000 audit[1933]: CRED_DISP pid=1933 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:59.794635 systemd[1]: session-10.scope: Deactivated successfully. Jan 27 05:55:59.795191 systemd[1]: session-10.scope: Consumed 6.932s CPU time, 228.4M memory peak. Jan 27 05:55:59.803260 systemd-logind[1576]: Session 10 logged out. Waiting for processes to exit. Jan 27 05:55:59.808723 systemd-logind[1576]: Removed session 10. Jan 27 05:55:59.813389 kernel: audit: type=1104 audit(1769493359.744:504): pid=1933 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:55:59.787000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.128.0.23:22-4.153.228.146:53978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:55:59.841448 kernel: audit: type=1131 audit(1769493359.787:505): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.128.0.23:22-4.153.228.146:53978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:56:01.056000 audit[3271]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.075981 kernel: audit: type=1325 audit(1769493361.056:506): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.056000 audit[3271]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc6872ba90 a2=0 a3=7ffc6872ba7c items=0 ppid=2979 pid=3271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.113406 kernel: audit: type=1300 audit(1769493361.056:506): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc6872ba90 a2=0 a3=7ffc6872ba7c items=0 ppid=2979 pid=3271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.056000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:01.078000 audit[3271]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.149216 kernel: audit: type=1327 audit(1769493361.056:506): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:01.149340 kernel: audit: type=1325 audit(1769493361.078:507): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3271 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.149409 kernel: audit: type=1300 audit(1769493361.078:507): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc6872ba90 a2=0 a3=0 items=0 ppid=2979 pid=3271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.078000 audit[3271]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc6872ba90 a2=0 a3=0 items=0 ppid=2979 pid=3271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.078000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:01.206000 audit[3273]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3273 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.206000 audit[3273]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff8a749c60 a2=0 a3=7fff8a749c4c items=0 ppid=2979 pid=3273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.206000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:01.211000 audit[3273]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3273 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:01.211000 audit[3273]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff8a749c60 a2=0 a3=0 items=0 ppid=2979 pid=3273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:01.211000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.290443 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 27 05:56:05.290630 kernel: audit: type=1325 audit(1769493365.268:510): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3275 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.268000 audit[3275]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3275 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.268000 audit[3275]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffdb59241e0 a2=0 a3=7ffdb59241cc items=0 ppid=2979 pid=3275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.341767 kernel: audit: type=1300 audit(1769493365.268:510): arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffdb59241e0 a2=0 a3=7ffdb59241cc items=0 ppid=2979 pid=3275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.342089 kernel: audit: type=1327 audit(1769493365.268:510): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.268000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.358286 kernel: audit: type=1325 audit(1769493365.308:511): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3275 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.308000 audit[3275]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3275 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.308000 audit[3275]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb59241e0 a2=0 a3=0 items=0 ppid=2979 pid=3275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.406495 kernel: audit: type=1300 audit(1769493365.308:511): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb59241e0 a2=0 a3=0 items=0 ppid=2979 pid=3275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.406649 kernel: audit: type=1327 audit(1769493365.308:511): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.308000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 
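The iptables-restore records above (and those that follow) come from the same parent process (ppid=2979) as the earlier chain-creation calls, and the filter-table entry count grows from one generation to the next as rules are resynced. A minimal parsing sketch (Python; the regex and sample lines are illustrative, with the samples abbreviated from records in this log) pulls the table, family and entry count out of NETFILTER_CFG records so that growth is easy to follow:

import re

# Extract (table, generation, family, entries, op) from NETFILTER_CFG records.
NETFILTER_RE = re.compile(
    r"NETFILTER_CFG table=(?P<table>\w+):(?P<gen>\d+) "
    r"family=(?P<family>\d+) entries=(?P<entries>\d+) op=(?P<op>\w+)"
)

FAMILIES = {"2": "IPv4", "10": "IPv6"}   # AF_INET / AF_INET6

samples = [  # abbreviated copies of records from this log
    "audit[3271]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule",
    "audit[3275]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule",
    "audit[3279]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule",
]

for line in samples:
    m = NETFILTER_RE.search(line)
    if m:
        fam = FAMILIES.get(m["family"], m["family"])
        print(f"{fam} {m['table']} gen {m['gen']}: {m['entries']} entries ({m['op']})")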
Jan 27 05:56:05.409000 audit[3277]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3277 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.460857 kernel: audit: type=1325 audit(1769493365.409:512): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3277 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.461001 kernel: audit: type=1300 audit(1769493365.409:512): arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffd5f00fff0 a2=0 a3=7ffd5f00ffdc items=0 ppid=2979 pid=3277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.409000 audit[3277]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffd5f00fff0 a2=0 a3=7ffd5f00ffdc items=0 ppid=2979 pid=3277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.476818 kernel: audit: type=1327 audit(1769493365.409:512): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.409000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.476000 audit[3277]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3277 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:05.476000 audit[3277]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd5f00fff0 a2=0 a3=0 items=0 ppid=2979 pid=3277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:05.476000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:05.495420 kernel: audit: type=1325 audit(1769493365.476:513): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3277 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:06.510000 audit[3279]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3279 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:06.510000 audit[3279]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff43aa4630 a2=0 a3=7fff43aa461c items=0 ppid=2979 pid=3279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:06.510000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:06.525000 audit[3279]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3279 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:06.525000 audit[3279]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff43aa4630 a2=0 a3=0 items=0 ppid=2979 pid=3279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 27 05:56:06.525000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:07.377477 kubelet[2857]: I0127 05:56:07.377196 2857 status_manager.go:890] "Failed to get status for pod" podUID="41dd3ec0-1adc-4e72-8f10-6697a55cd87e" pod="calico-system/calico-typha-6b579b76c6-kwq9s" err="pods \"calico-typha-6b579b76c6-kwq9s\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" Jan 27 05:56:07.378284 kubelet[2857]: W0127 05:56:07.377353 2857 reflector.go:569] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object Jan 27 05:56:07.381117 kubelet[2857]: E0127 05:56:07.377690 2857 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" logger="UnhandledError" Jan 27 05:56:07.381117 kubelet[2857]: W0127 05:56:07.379135 2857 reflector.go:569] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object Jan 27 05:56:07.381117 kubelet[2857]: E0127 05:56:07.379608 2857 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" logger="UnhandledError" Jan 27 05:56:07.381117 kubelet[2857]: W0127 05:56:07.379458 2857 reflector.go:569] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object Jan 27 05:56:07.382981 kubelet[2857]: E0127 05:56:07.380461 2857 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot list resource \"configmaps\" 
in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" logger="UnhandledError" Jan 27 05:56:07.381911 systemd[1]: Created slice kubepods-besteffort-pod41dd3ec0_1adc_4e72_8f10_6697a55cd87e.slice - libcontainer container kubepods-besteffort-pod41dd3ec0_1adc_4e72_8f10_6697a55cd87e.slice. Jan 27 05:56:07.360000 audit[3281]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3281 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:07.360000 audit[3281]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc67600d00 a2=0 a3=7ffc67600cec items=0 ppid=2979 pid=3281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:07.360000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:07.400000 kubelet[2857]: I0127 05:56:07.399881 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-tigera-ca-bundle\") pod \"calico-typha-6b579b76c6-kwq9s\" (UID: \"41dd3ec0-1adc-4e72-8f10-6697a55cd87e\") " pod="calico-system/calico-typha-6b579b76c6-kwq9s" Jan 27 05:56:07.400000 kubelet[2857]: I0127 05:56:07.399917 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-typha-certs\") pod \"calico-typha-6b579b76c6-kwq9s\" (UID: \"41dd3ec0-1adc-4e72-8f10-6697a55cd87e\") " pod="calico-system/calico-typha-6b579b76c6-kwq9s" Jan 27 05:56:07.400000 kubelet[2857]: I0127 05:56:07.399936 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt8df\" (UniqueName: \"kubernetes.io/projected/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-kube-api-access-nt8df\") pod \"calico-typha-6b579b76c6-kwq9s\" (UID: \"41dd3ec0-1adc-4e72-8f10-6697a55cd87e\") " pod="calico-system/calico-typha-6b579b76c6-kwq9s" Jan 27 05:56:07.399000 audit[3281]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3281 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:07.399000 audit[3281]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc67600d00 a2=0 a3=0 items=0 ppid=2979 pid=3281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:07.399000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:07.470138 kubelet[2857]: W0127 05:56:07.470089 2857 reflector.go:569] object-"calico-system"/"node-certs": failed to list *v1.Secret: secrets "node-certs" is forbidden: User "system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object Jan 27 05:56:07.470338 kubelet[2857]: E0127 05:56:07.470141 2857 reflector.go:166] "Unhandled Error" 
err="object-\"calico-system\"/\"node-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-certs\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" logger="UnhandledError" Jan 27 05:56:07.476305 kubelet[2857]: W0127 05:56:07.476267 2857 reflector.go:569] object-"calico-system"/"cni-config": failed to list *v1.ConfigMap: configmaps "cni-config" is forbidden: User "system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object Jan 27 05:56:07.476715 kubelet[2857]: E0127 05:56:07.476319 2857 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"cni-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-config\" is forbidden: User \"system:node:ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' and this object" logger="UnhandledError" Jan 27 05:56:07.480555 systemd[1]: Created slice kubepods-besteffort-podb54bb9cd_85af_43b8_82d9_1d8d2e73d3e4.slice - libcontainer container kubepods-besteffort-podb54bb9cd_85af_43b8_82d9_1d8d2e73d3e4.slice. Jan 27 05:56:07.500665 kubelet[2857]: I0127 05:56:07.500629 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt44t\" (UniqueName: \"kubernetes.io/projected/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-kube-api-access-nt44t\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503346 kubelet[2857]: I0127 05:56:07.502439 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-tigera-ca-bundle\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503346 kubelet[2857]: I0127 05:56:07.502483 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-var-run-calico\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503346 kubelet[2857]: I0127 05:56:07.502553 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-policysync\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503346 kubelet[2857]: I0127 05:56:07.502599 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-xtables-lock\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " 
pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503346 kubelet[2857]: I0127 05:56:07.502629 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-var-lib-calico\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503684 kubelet[2857]: I0127 05:56:07.502671 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-cni-log-dir\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503684 kubelet[2857]: I0127 05:56:07.502701 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-cni-net-dir\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503684 kubelet[2857]: I0127 05:56:07.502731 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-cni-bin-dir\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503684 kubelet[2857]: I0127 05:56:07.502760 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-flexvol-driver-host\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503684 kubelet[2857]: I0127 05:56:07.502788 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-lib-modules\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.503939 kubelet[2857]: I0127 05:56:07.502821 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-node-certs\") pod \"calico-node-vk4rp\" (UID: \"b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4\") " pod="calico-system/calico-node-vk4rp" Jan 27 05:56:07.582747 kubelet[2857]: E0127 05:56:07.582681 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:07.677336 kubelet[2857]: E0127 05:56:07.675728 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.677781 kubelet[2857]: W0127 05:56:07.677578 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 
27 05:56:07.677781 kubelet[2857]: E0127 05:56:07.677667 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.678690 kubelet[2857]: E0127 05:56:07.678518 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.678690 kubelet[2857]: W0127 05:56:07.678539 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.678690 kubelet[2857]: E0127 05:56:07.678562 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.679245 kubelet[2857]: E0127 05:56:07.679131 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.679245 kubelet[2857]: W0127 05:56:07.679150 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.679245 kubelet[2857]: E0127 05:56:07.679170 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.679871 kubelet[2857]: E0127 05:56:07.679754 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.679871 kubelet[2857]: W0127 05:56:07.679771 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.679871 kubelet[2857]: E0127 05:56:07.679787 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.680497 kubelet[2857]: E0127 05:56:07.680381 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.680497 kubelet[2857]: W0127 05:56:07.680408 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.680497 kubelet[2857]: E0127 05:56:07.680426 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.681068 kubelet[2857]: E0127 05:56:07.680945 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.681068 kubelet[2857]: W0127 05:56:07.680963 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.681068 kubelet[2857]: E0127 05:56:07.680981 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.681727 kubelet[2857]: E0127 05:56:07.681616 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.681727 kubelet[2857]: W0127 05:56:07.681637 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.681727 kubelet[2857]: E0127 05:56:07.681655 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.682264 kubelet[2857]: E0127 05:56:07.682154 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.682264 kubelet[2857]: W0127 05:56:07.682171 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.682264 kubelet[2857]: E0127 05:56:07.682188 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.682878 kubelet[2857]: E0127 05:56:07.682781 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.682878 kubelet[2857]: W0127 05:56:07.682798 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.682878 kubelet[2857]: E0127 05:56:07.682815 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.683397 kubelet[2857]: E0127 05:56:07.683340 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.683595 kubelet[2857]: W0127 05:56:07.683505 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.683595 kubelet[2857]: E0127 05:56:07.683532 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.684099 kubelet[2857]: E0127 05:56:07.683999 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.684099 kubelet[2857]: W0127 05:56:07.684016 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.684099 kubelet[2857]: E0127 05:56:07.684033 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.684708 kubelet[2857]: E0127 05:56:07.684592 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.684708 kubelet[2857]: W0127 05:56:07.684609 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.684708 kubelet[2857]: E0127 05:56:07.684627 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.685247 kubelet[2857]: E0127 05:56:07.685153 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.685247 kubelet[2857]: W0127 05:56:07.685170 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.685247 kubelet[2857]: E0127 05:56:07.685187 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.685849 kubelet[2857]: E0127 05:56:07.685751 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.685849 kubelet[2857]: W0127 05:56:07.685770 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.685849 kubelet[2857]: E0127 05:56:07.685787 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.686420 kubelet[2857]: E0127 05:56:07.686286 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.686420 kubelet[2857]: W0127 05:56:07.686303 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.686420 kubelet[2857]: E0127 05:56:07.686320 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.686983 kubelet[2857]: E0127 05:56:07.686867 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.686983 kubelet[2857]: W0127 05:56:07.686885 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.686983 kubelet[2857]: E0127 05:56:07.686902 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.687554 kubelet[2857]: E0127 05:56:07.687457 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.687554 kubelet[2857]: W0127 05:56:07.687475 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.687554 kubelet[2857]: E0127 05:56:07.687492 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.688101 kubelet[2857]: E0127 05:56:07.688006 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.688101 kubelet[2857]: W0127 05:56:07.688026 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.688101 kubelet[2857]: E0127 05:56:07.688043 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.688640 kubelet[2857]: E0127 05:56:07.688541 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.688640 kubelet[2857]: W0127 05:56:07.688559 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.688640 kubelet[2857]: E0127 05:56:07.688575 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.689225 kubelet[2857]: E0127 05:56:07.689070 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.689225 kubelet[2857]: W0127 05:56:07.689088 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.689225 kubelet[2857]: E0127 05:56:07.689108 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.705208 kubelet[2857]: E0127 05:56:07.705132 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.705208 kubelet[2857]: W0127 05:56:07.705156 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.705208 kubelet[2857]: E0127 05:56:07.705178 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.705730 kubelet[2857]: I0127 05:56:07.705501 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-registration-dir\") pod \"csi-node-driver-s7bgd\" (UID: \"8f740da6-d731-4b30-bf8e-ada1ccd8b61b\") " pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:07.706162 kubelet[2857]: E0127 05:56:07.706112 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.706162 kubelet[2857]: W0127 05:56:07.706134 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.706448 kubelet[2857]: E0127 05:56:07.706327 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.706953 kubelet[2857]: E0127 05:56:07.706759 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.706953 kubelet[2857]: W0127 05:56:07.706783 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.706953 kubelet[2857]: E0127 05:56:07.706805 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.707410 kubelet[2857]: I0127 05:56:07.707352 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-socket-dir\") pod \"csi-node-driver-s7bgd\" (UID: \"8f740da6-d731-4b30-bf8e-ada1ccd8b61b\") " pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:07.707675 kubelet[2857]: E0127 05:56:07.707619 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.707675 kubelet[2857]: W0127 05:56:07.707636 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.707675 kubelet[2857]: E0127 05:56:07.707654 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.708385 kubelet[2857]: E0127 05:56:07.708314 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.708385 kubelet[2857]: W0127 05:56:07.708335 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.708743 kubelet[2857]: E0127 05:56:07.708574 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.709062 kubelet[2857]: E0127 05:56:07.709021 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.709062 kubelet[2857]: W0127 05:56:07.709040 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.709401 kubelet[2857]: E0127 05:56:07.709259 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.710542 kubelet[2857]: E0127 05:56:07.710522 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.711624 kubelet[2857]: W0127 05:56:07.711435 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.711624 kubelet[2857]: E0127 05:56:07.711464 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.711624 kubelet[2857]: I0127 05:56:07.711496 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-varrun\") pod \"csi-node-driver-s7bgd\" (UID: \"8f740da6-d731-4b30-bf8e-ada1ccd8b61b\") " pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:07.712038 kubelet[2857]: E0127 05:56:07.712018 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.712271 kubelet[2857]: W0127 05:56:07.712250 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.712517 kubelet[2857]: E0127 05:56:07.712357 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.712745 kubelet[2857]: I0127 05:56:07.712721 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-kubelet-dir\") pod \"csi-node-driver-s7bgd\" (UID: \"8f740da6-d731-4b30-bf8e-ada1ccd8b61b\") " pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:07.714282 kubelet[2857]: E0127 05:56:07.714260 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.714716 kubelet[2857]: W0127 05:56:07.714462 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.714716 kubelet[2857]: E0127 05:56:07.714493 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.714716 kubelet[2857]: I0127 05:56:07.714532 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k2v\" (UniqueName: \"kubernetes.io/projected/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-kube-api-access-h2k2v\") pod \"csi-node-driver-s7bgd\" (UID: \"8f740da6-d731-4b30-bf8e-ada1ccd8b61b\") " pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:07.719020 kubelet[2857]: E0127 05:56:07.718537 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.719020 kubelet[2857]: W0127 05:56:07.718560 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.719020 kubelet[2857]: E0127 05:56:07.718606 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.719537 kubelet[2857]: E0127 05:56:07.718987 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.719537 kubelet[2857]: W0127 05:56:07.719474 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.719537 kubelet[2857]: E0127 05:56:07.719494 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.722643 kubelet[2857]: E0127 05:56:07.721945 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.723104 kubelet[2857]: W0127 05:56:07.722820 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.723104 kubelet[2857]: E0127 05:56:07.722849 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.724376 kubelet[2857]: E0127 05:56:07.724079 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.724376 kubelet[2857]: W0127 05:56:07.724098 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.724376 kubelet[2857]: E0127 05:56:07.724115 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.726468 kubelet[2857]: E0127 05:56:07.726448 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.726778 kubelet[2857]: W0127 05:56:07.726610 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.726778 kubelet[2857]: E0127 05:56:07.726637 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.728418 kubelet[2857]: E0127 05:56:07.728147 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.728545 kubelet[2857]: W0127 05:56:07.728523 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.728689 kubelet[2857]: E0127 05:56:07.728652 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.815483 kubelet[2857]: E0127 05:56:07.815436 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.815483 kubelet[2857]: W0127 05:56:07.815473 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.815784 kubelet[2857]: E0127 05:56:07.815506 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.816039 kubelet[2857]: E0127 05:56:07.815919 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.816039 kubelet[2857]: W0127 05:56:07.815938 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.816039 kubelet[2857]: E0127 05:56:07.815976 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.816834 kubelet[2857]: E0127 05:56:07.816709 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.816834 kubelet[2857]: W0127 05:56:07.816726 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.816834 kubelet[2857]: E0127 05:56:07.816745 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.817519 kubelet[2857]: E0127 05:56:07.817424 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.817519 kubelet[2857]: W0127 05:56:07.817440 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.817519 kubelet[2857]: E0127 05:56:07.817476 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.817819 kubelet[2857]: E0127 05:56:07.817792 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.817819 kubelet[2857]: W0127 05:56:07.817820 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.817962 kubelet[2857]: E0127 05:56:07.817858 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.818566 kubelet[2857]: E0127 05:56:07.818517 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.818566 kubelet[2857]: W0127 05:56:07.818562 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.818720 kubelet[2857]: E0127 05:56:07.818607 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.819088 kubelet[2857]: E0127 05:56:07.819056 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.819088 kubelet[2857]: W0127 05:56:07.819076 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.819579 kubelet[2857]: E0127 05:56:07.819141 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.819855 kubelet[2857]: E0127 05:56:07.819667 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.819855 kubelet[2857]: W0127 05:56:07.819684 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.819855 kubelet[2857]: E0127 05:56:07.819736 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.820175 kubelet[2857]: E0127 05:56:07.820125 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.820175 kubelet[2857]: W0127 05:56:07.820176 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.820313 kubelet[2857]: E0127 05:56:07.820213 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.820592 kubelet[2857]: E0127 05:56:07.820567 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.820592 kubelet[2857]: W0127 05:56:07.820589 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.820732 kubelet[2857]: E0127 05:56:07.820611 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.820995 kubelet[2857]: E0127 05:56:07.820913 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.820995 kubelet[2857]: W0127 05:56:07.820934 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.821136 kubelet[2857]: E0127 05:56:07.821025 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.821736 kubelet[2857]: E0127 05:56:07.821653 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.821736 kubelet[2857]: W0127 05:56:07.821673 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.822433 kubelet[2857]: E0127 05:56:07.822390 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.823409 kubelet[2857]: E0127 05:56:07.823353 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.824188 kubelet[2857]: W0127 05:56:07.823393 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.824188 kubelet[2857]: E0127 05:56:07.823879 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.824340 kubelet[2857]: E0127 05:56:07.824236 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.824340 kubelet[2857]: W0127 05:56:07.824250 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.824340 kubelet[2857]: E0127 05:56:07.824277 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.824684 kubelet[2857]: E0127 05:56:07.824589 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.824684 kubelet[2857]: W0127 05:56:07.824604 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.825141 kubelet[2857]: E0127 05:56:07.824872 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.825141 kubelet[2857]: W0127 05:56:07.824890 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.825283 kubelet[2857]: E0127 05:56:07.825184 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.825283 kubelet[2857]: W0127 05:56:07.825197 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.825283 kubelet[2857]: E0127 05:56:07.825216 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.825843 kubelet[2857]: E0127 05:56:07.825520 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.825843 kubelet[2857]: W0127 05:56:07.825533 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.825843 kubelet[2857]: E0127 05:56:07.825549 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.825843 kubelet[2857]: E0127 05:56:07.825805 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.825843 kubelet[2857]: W0127 05:56:07.825820 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.825843 kubelet[2857]: E0127 05:56:07.825835 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.826236 kubelet[2857]: E0127 05:56:07.826135 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.826236 kubelet[2857]: W0127 05:56:07.826149 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.826236 kubelet[2857]: E0127 05:56:07.826164 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.827225 kubelet[2857]: E0127 05:56:07.826336 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.827225 kubelet[2857]: E0127 05:56:07.826910 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.827225 kubelet[2857]: W0127 05:56:07.826965 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.827225 kubelet[2857]: E0127 05:56:07.826983 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.827225 kubelet[2857]: E0127 05:56:07.827051 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:07.828633 kubelet[2857]: E0127 05:56:07.828609 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.828633 kubelet[2857]: W0127 05:56:07.828632 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.828789 kubelet[2857]: E0127 05:56:07.828649 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.829787 kubelet[2857]: E0127 05:56:07.829758 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.829787 kubelet[2857]: W0127 05:56:07.829785 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.829920 kubelet[2857]: E0127 05:56:07.829815 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.830767 kubelet[2857]: E0127 05:56:07.830635 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.830767 kubelet[2857]: W0127 05:56:07.830767 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.830911 kubelet[2857]: E0127 05:56:07.830786 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:07.831488 kubelet[2857]: E0127 05:56:07.831464 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:07.831488 kubelet[2857]: W0127 05:56:07.831485 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:07.831637 kubelet[2857]: E0127 05:56:07.831504 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.416000 audit[3343]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:08.416000 audit[3343]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffed8ea03c0 a2=0 a3=7ffed8ea03ac items=0 ppid=2979 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:08.416000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:08.418000 audit[3343]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:08.418000 audit[3343]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffed8ea03c0 a2=0 a3=0 items=0 ppid=2979 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:08.418000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:08.504155 kubelet[2857]: E0127 05:56:08.504088 2857 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.505055 kubelet[2857]: E0127 05:56:08.504102 2857 secret.go:189] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition Jan 27 05:56:08.505055 kubelet[2857]: E0127 05:56:08.504241 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-tigera-ca-bundle podName:41dd3ec0-1adc-4e72-8f10-6697a55cd87e nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.004186392 +0000 UTC m=+24.349187709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-tigera-ca-bundle") pod "calico-typha-6b579b76c6-kwq9s" (UID: "41dd3ec0-1adc-4e72-8f10-6697a55cd87e") : failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.505055 kubelet[2857]: E0127 05:56:08.504279 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-typha-certs podName:41dd3ec0-1adc-4e72-8f10-6697a55cd87e nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.004256174 +0000 UTC m=+24.349257488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-typha-certs") pod "calico-typha-6b579b76c6-kwq9s" (UID: "41dd3ec0-1adc-4e72-8f10-6697a55cd87e") : failed to sync secret cache: timed out waiting for the condition Jan 27 05:56:08.515922 kubelet[2857]: E0127 05:56:08.515878 2857 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.515922 kubelet[2857]: E0127 05:56:08.515917 2857 projected.go:194] Error preparing data for projected volume kube-api-access-nt8df for pod calico-system/calico-typha-6b579b76c6-kwq9s: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.516097 kubelet[2857]: E0127 05:56:08.516003 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-kube-api-access-nt8df podName:41dd3ec0-1adc-4e72-8f10-6697a55cd87e nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.015979608 +0000 UTC m=+24.360980925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nt8df" (UniqueName: "kubernetes.io/projected/41dd3ec0-1adc-4e72-8f10-6697a55cd87e-kube-api-access-nt8df") pod "calico-typha-6b579b76c6-kwq9s" (UID: "41dd3ec0-1adc-4e72-8f10-6697a55cd87e") : failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.529101 kubelet[2857]: E0127 05:56:08.529047 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.529101 kubelet[2857]: W0127 05:56:08.529075 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.529263 kubelet[2857]: E0127 05:56:08.529110 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.529541 kubelet[2857]: E0127 05:56:08.529515 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.529541 kubelet[2857]: W0127 05:56:08.529538 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.529806 kubelet[2857]: E0127 05:56:08.529562 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.529954 kubelet[2857]: E0127 05:56:08.529933 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.529954 kubelet[2857]: W0127 05:56:08.529952 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.530077 kubelet[2857]: E0127 05:56:08.529973 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.604392 kubelet[2857]: E0127 05:56:08.604252 2857 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.604769 kubelet[2857]: E0127 05:56:08.604635 2857 secret.go:189] Couldn't get secret calico-system/node-certs: failed to sync secret cache: timed out waiting for the condition Jan 27 05:56:08.604769 kubelet[2857]: E0127 05:56:08.604720 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-tigera-ca-bundle podName:b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4 nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.104328132 +0000 UTC m=+24.449329442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-tigera-ca-bundle") pod "calico-node-vk4rp" (UID: "b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4") : failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.605316 kubelet[2857]: E0127 05:56:08.605296 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-node-certs podName:b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4 nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.104732898 +0000 UTC m=+24.449734207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-certs" (UniqueName: "kubernetes.io/secret/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-node-certs") pod "calico-node-vk4rp" (UID: "b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4") : failed to sync secret cache: timed out waiting for the condition Jan 27 05:56:08.614004 kubelet[2857]: E0127 05:56:08.613971 2857 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.614004 kubelet[2857]: E0127 05:56:08.614009 2857 projected.go:194] Error preparing data for projected volume kube-api-access-nt44t for pod calico-system/calico-node-vk4rp: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.614173 kubelet[2857]: E0127 05:56:08.614080 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-kube-api-access-nt44t podName:b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4 nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.114058517 +0000 UTC m=+24.459059828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nt44t" (UniqueName: "kubernetes.io/projected/b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4-kube-api-access-nt44t") pod "calico-node-vk4rp" (UID: "b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4") : failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.631404 kubelet[2857]: E0127 05:56:08.631352 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.631404 kubelet[2857]: W0127 05:56:08.631398 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.631630 kubelet[2857]: E0127 05:56:08.631425 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.631772 kubelet[2857]: E0127 05:56:08.631748 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.631772 kubelet[2857]: W0127 05:56:08.631769 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.631920 kubelet[2857]: E0127 05:56:08.631788 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.632259 kubelet[2857]: E0127 05:56:08.632082 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.632259 kubelet[2857]: W0127 05:56:08.632102 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.632259 kubelet[2857]: E0127 05:56:08.632120 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.632556 kubelet[2857]: E0127 05:56:08.632533 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.632556 kubelet[2857]: W0127 05:56:08.632553 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.632680 kubelet[2857]: E0127 05:56:08.632571 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.632877 kubelet[2857]: E0127 05:56:08.632859 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.632877 kubelet[2857]: W0127 05:56:08.632875 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.633014 kubelet[2857]: E0127 05:56:08.632890 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.633211 kubelet[2857]: E0127 05:56:08.633190 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.633211 kubelet[2857]: W0127 05:56:08.633209 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.633336 kubelet[2857]: E0127 05:56:08.633225 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.734262 kubelet[2857]: E0127 05:56:08.734212 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.734262 kubelet[2857]: W0127 05:56:08.734246 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.734575 kubelet[2857]: E0127 05:56:08.734278 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.734691 kubelet[2857]: E0127 05:56:08.734667 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.734691 kubelet[2857]: W0127 05:56:08.734689 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.734808 kubelet[2857]: E0127 05:56:08.734709 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.735128 kubelet[2857]: E0127 05:56:08.735103 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.735128 kubelet[2857]: W0127 05:56:08.735124 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.735330 kubelet[2857]: E0127 05:56:08.735142 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.735534 kubelet[2857]: E0127 05:56:08.735514 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.735534 kubelet[2857]: W0127 05:56:08.735531 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.735650 kubelet[2857]: E0127 05:56:08.735549 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.735938 kubelet[2857]: E0127 05:56:08.735901 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.735938 kubelet[2857]: W0127 05:56:08.735931 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.736245 kubelet[2857]: E0127 05:56:08.735950 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.736336 kubelet[2857]: E0127 05:56:08.736272 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.736336 kubelet[2857]: W0127 05:56:08.736285 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.736336 kubelet[2857]: E0127 05:56:08.736303 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.837308 kubelet[2857]: E0127 05:56:08.837269 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.837308 kubelet[2857]: W0127 05:56:08.837297 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.837615 kubelet[2857]: E0127 05:56:08.837326 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.837700 kubelet[2857]: E0127 05:56:08.837669 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.837700 kubelet[2857]: W0127 05:56:08.837688 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.837708 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.837839 2857 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.837868 2857 projected.go:194] Error preparing data for projected volume kube-api-access-h2k2v for pod calico-system/csi-node-driver-s7bgd: failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.837939 2857 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-kube-api-access-h2k2v podName:8f740da6-d731-4b30-bf8e-ada1ccd8b61b nodeName:}" failed. No retries permitted until 2026-01-27 05:56:09.337916449 +0000 UTC m=+24.682917767 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h2k2v" (UniqueName: "kubernetes.io/projected/8f740da6-d731-4b30-bf8e-ada1ccd8b61b-kube-api-access-h2k2v") pod "csi-node-driver-s7bgd" (UID: "8f740da6-d731-4b30-bf8e-ada1ccd8b61b") : failed to sync configmap cache: timed out waiting for the condition Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.837994 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.838303 kubelet[2857]: W0127 05:56:08.838006 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.838303 kubelet[2857]: E0127 05:56:08.838021 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.839268 kubelet[2857]: E0127 05:56:08.839154 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.839268 kubelet[2857]: W0127 05:56:08.839199 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.839268 kubelet[2857]: E0127 05:56:08.839222 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.840136 kubelet[2857]: E0127 05:56:08.840082 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.840275 kubelet[2857]: W0127 05:56:08.840136 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.840275 kubelet[2857]: E0127 05:56:08.840160 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.840568 kubelet[2857]: E0127 05:56:08.840526 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.840568 kubelet[2857]: W0127 05:56:08.840545 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.840568 kubelet[2857]: E0127 05:56:08.840562 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.941586 kubelet[2857]: E0127 05:56:08.941539 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.941586 kubelet[2857]: W0127 05:56:08.941572 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.941847 kubelet[2857]: E0127 05:56:08.941603 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.942442 kubelet[2857]: E0127 05:56:08.942413 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.942442 kubelet[2857]: W0127 05:56:08.942442 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.942632 kubelet[2857]: E0127 05:56:08.942492 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.942946 kubelet[2857]: E0127 05:56:08.942922 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.942946 kubelet[2857]: W0127 05:56:08.942945 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.943508 kubelet[2857]: E0127 05:56:08.942964 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.944412 kubelet[2857]: E0127 05:56:08.943619 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.944522 kubelet[2857]: W0127 05:56:08.944424 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.944522 kubelet[2857]: E0127 05:56:08.944445 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.944892 kubelet[2857]: E0127 05:56:08.944829 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.945256 kubelet[2857]: W0127 05:56:08.945000 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.945256 kubelet[2857]: E0127 05:56:08.945027 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:08.946141 kubelet[2857]: E0127 05:56:08.945973 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.946141 kubelet[2857]: W0127 05:56:08.946075 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.946767 kubelet[2857]: E0127 05:56:08.946545 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:08.948158 kubelet[2857]: E0127 05:56:08.947559 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:08.948158 kubelet[2857]: W0127 05:56:08.947582 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:08.948158 kubelet[2857]: E0127 05:56:08.947602 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.048457 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.049769 kubelet[2857]: W0127 05:56:09.048485 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.048514 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.048834 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.049769 kubelet[2857]: W0127 05:56:09.048849 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.048866 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.049159 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.049769 kubelet[2857]: W0127 05:56:09.049172 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.049186 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.049769 kubelet[2857]: E0127 05:56:09.049469 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.050590 kubelet[2857]: W0127 05:56:09.049482 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.050590 kubelet[2857]: E0127 05:56:09.049498 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.050590 kubelet[2857]: E0127 05:56:09.049843 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.050590 kubelet[2857]: W0127 05:56:09.049857 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.050590 kubelet[2857]: E0127 05:56:09.049891 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.050590 kubelet[2857]: E0127 05:56:09.050241 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.050590 kubelet[2857]: W0127 05:56:09.050254 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.050590 kubelet[2857]: E0127 05:56:09.050289 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.050666 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.051915 kubelet[2857]: W0127 05:56:09.050683 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.050710 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.051014 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.051915 kubelet[2857]: W0127 05:56:09.051030 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.051065 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.051457 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.051915 kubelet[2857]: W0127 05:56:09.051473 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.051605 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.051915 kubelet[2857]: E0127 05:56:09.051857 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.053828 kubelet[2857]: W0127 05:56:09.051870 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052111 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052203 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.053828 kubelet[2857]: W0127 05:56:09.052215 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052249 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052524 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.053828 kubelet[2857]: W0127 05:56:09.052538 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052630 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.053828 kubelet[2857]: E0127 05:56:09.052902 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.053828 kubelet[2857]: W0127 05:56:09.052915 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.054310 kubelet[2857]: E0127 05:56:09.052937 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.054310 kubelet[2857]: E0127 05:56:09.054166 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.054310 kubelet[2857]: W0127 05:56:09.054183 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.054310 kubelet[2857]: E0127 05:56:09.054203 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.055666 kubelet[2857]: E0127 05:56:09.054524 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.055666 kubelet[2857]: W0127 05:56:09.054537 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.055666 kubelet[2857]: E0127 05:56:09.054553 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.055666 kubelet[2857]: E0127 05:56:09.055248 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.055666 kubelet[2857]: W0127 05:56:09.055263 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.055666 kubelet[2857]: E0127 05:56:09.055280 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.056514 kubelet[2857]: E0127 05:56:09.056335 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.056609 kubelet[2857]: W0127 05:56:09.056531 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.056609 kubelet[2857]: E0127 05:56:09.056557 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.057511 kubelet[2857]: E0127 05:56:09.057487 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.057511 kubelet[2857]: W0127 05:56:09.057507 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.057653 kubelet[2857]: E0127 05:56:09.057525 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.058661 kubelet[2857]: E0127 05:56:09.058516 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.058661 kubelet[2857]: W0127 05:56:09.058537 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.058661 kubelet[2857]: E0127 05:56:09.058555 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.066389 kubelet[2857]: E0127 05:56:09.064695 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.066389 kubelet[2857]: W0127 05:56:09.064714 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.066389 kubelet[2857]: E0127 05:56:09.064732 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.068726 kubelet[2857]: E0127 05:56:09.067472 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.068726 kubelet[2857]: W0127 05:56:09.067492 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.068726 kubelet[2857]: E0127 05:56:09.067516 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.068726 kubelet[2857]: E0127 05:56:09.067812 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.068726 kubelet[2857]: W0127 05:56:09.067823 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.068726 kubelet[2857]: E0127 05:56:09.067837 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.153592 kubelet[2857]: E0127 05:56:09.153552 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.153592 kubelet[2857]: W0127 05:56:09.153583 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.153592 kubelet[2857]: E0127 05:56:09.153614 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.154015 kubelet[2857]: E0127 05:56:09.153986 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.154015 kubelet[2857]: W0127 05:56:09.154007 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.154246 kubelet[2857]: E0127 05:56:09.154031 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.154458 kubelet[2857]: E0127 05:56:09.154437 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.154458 kubelet[2857]: W0127 05:56:09.154456 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.154577 kubelet[2857]: E0127 05:56:09.154493 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.154848 kubelet[2857]: E0127 05:56:09.154827 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.154848 kubelet[2857]: W0127 05:56:09.154848 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.155156 kubelet[2857]: E0127 05:56:09.154873 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.155232 kubelet[2857]: E0127 05:56:09.155200 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.155232 kubelet[2857]: W0127 05:56:09.155215 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.155330 kubelet[2857]: E0127 05:56:09.155248 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.155621 kubelet[2857]: E0127 05:56:09.155596 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.155621 kubelet[2857]: W0127 05:56:09.155619 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.155791 kubelet[2857]: E0127 05:56:09.155645 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.156047 kubelet[2857]: E0127 05:56:09.156024 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.156047 kubelet[2857]: W0127 05:56:09.156044 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.156306 kubelet[2857]: E0127 05:56:09.156282 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.156602 kubelet[2857]: E0127 05:56:09.156440 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.156602 kubelet[2857]: W0127 05:56:09.156454 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.156953 kubelet[2857]: E0127 05:56:09.156922 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.157078 kubelet[2857]: W0127 05:56:09.157058 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.157188 kubelet[2857]: E0127 05:56:09.157173 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.157566 kubelet[2857]: E0127 05:56:09.157539 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.157815 kubelet[2857]: W0127 05:56:09.157674 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.157815 kubelet[2857]: E0127 05:56:09.157701 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.158140 kubelet[2857]: E0127 05:56:09.158120 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.158395 kubelet[2857]: W0127 05:56:09.158241 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.158395 kubelet[2857]: E0127 05:56:09.158266 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.158793 kubelet[2857]: E0127 05:56:09.158773 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.158912 kubelet[2857]: W0127 05:56:09.158895 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.159024 kubelet[2857]: E0127 05:56:09.159007 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.160278 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164392 kubelet[2857]: W0127 05:56:09.161273 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.161295 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.161203 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.161677 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164392 kubelet[2857]: W0127 05:56:09.161692 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.161712 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.161970 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164392 kubelet[2857]: W0127 05:56:09.161984 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164392 kubelet[2857]: E0127 05:56:09.162000 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.162258 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164968 kubelet[2857]: W0127 05:56:09.162270 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.162285 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.162578 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164968 kubelet[2857]: W0127 05:56:09.162591 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.162608 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.164540 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.164968 kubelet[2857]: W0127 05:56:09.164555 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.164968 kubelet[2857]: E0127 05:56:09.164574 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.168796 kubelet[2857]: E0127 05:56:09.168207 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.168796 kubelet[2857]: W0127 05:56:09.168231 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.168796 kubelet[2857]: E0127 05:56:09.168248 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.195876 containerd[1613]: time="2026-01-27T05:56:09.195780581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b579b76c6-kwq9s,Uid:41dd3ec0-1adc-4e72-8f10-6697a55cd87e,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:09.227732 containerd[1613]: time="2026-01-27T05:56:09.227665855Z" level=info msg="connecting to shim 27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b" address="unix:///run/containerd/s/c221926b5dc23fe19ac018061660d580a564a0c3867b6d5568f515da284ec233" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:09.262871 kubelet[2857]: E0127 05:56:09.262740 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.262871 kubelet[2857]: W0127 05:56:09.262767 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.262871 kubelet[2857]: E0127 05:56:09.262794 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.266651 systemd[1]: Started cri-containerd-27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b.scope - libcontainer container 27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b. Jan 27 05:56:09.285000 audit: BPF prog-id=152 op=LOAD Jan 27 05:56:09.287631 containerd[1613]: time="2026-01-27T05:56:09.287097388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vk4rp,Uid:b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:09.286000 audit: BPF prog-id=153 op=LOAD Jan 27 05:56:09.286000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.287000 audit: BPF prog-id=153 op=UNLOAD Jan 27 05:56:09.287000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.287000 audit: BPF prog-id=154 op=LOAD Jan 27 05:56:09.287000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.287000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.287000 audit: BPF prog-id=155 op=LOAD Jan 27 05:56:09.287000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.287000 audit: BPF prog-id=155 op=UNLOAD Jan 27 05:56:09.287000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.287000 audit: BPF prog-id=154 op=UNLOAD Jan 27 05:56:09.287000 audit[3439]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.288000 audit: BPF prog-id=156 op=LOAD Jan 27 05:56:09.288000 audit[3439]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3428 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.288000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237373939643737636266393861343336306162333737333131303339 Jan 27 05:56:09.320777 containerd[1613]: time="2026-01-27T05:56:09.320628758Z" level=info msg="connecting to shim 16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db" address="unix:///run/containerd/s/1ebd1c124047223b7dabc5657e1e41046e1afccb042a3b79219002054e8166d8" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:09.363947 kubelet[2857]: E0127 05:56:09.363803 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.363947 kubelet[2857]: W0127 
05:56:09.363831 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.363947 kubelet[2857]: E0127 05:56:09.363982 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.365002 kubelet[2857]: E0127 05:56:09.364890 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.365002 kubelet[2857]: W0127 05:56:09.364909 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.365002 kubelet[2857]: E0127 05:56:09.364956 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.365663 kubelet[2857]: E0127 05:56:09.365575 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.365663 kubelet[2857]: W0127 05:56:09.365594 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.365663 kubelet[2857]: E0127 05:56:09.365618 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.366168 kubelet[2857]: E0127 05:56:09.366128 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.366168 kubelet[2857]: W0127 05:56:09.366148 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.366168 kubelet[2857]: E0127 05:56:09.366168 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.366846 kubelet[2857]: E0127 05:56:09.366824 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.366846 kubelet[2857]: W0127 05:56:09.366847 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.366995 kubelet[2857]: E0127 05:56:09.366867 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:09.373920 kubelet[2857]: E0127 05:56:09.373705 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:09.373920 kubelet[2857]: W0127 05:56:09.373730 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:09.373920 kubelet[2857]: E0127 05:56:09.373750 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:09.380669 systemd[1]: Started cri-containerd-16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db.scope - libcontainer container 16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db. Jan 27 05:56:09.386019 containerd[1613]: time="2026-01-27T05:56:09.385976248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b579b76c6-kwq9s,Uid:41dd3ec0-1adc-4e72-8f10-6697a55cd87e,Namespace:calico-system,Attempt:0,} returns sandbox id \"27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b\"" Jan 27 05:56:09.388936 containerd[1613]: time="2026-01-27T05:56:09.388494477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 27 05:56:09.410000 audit: BPF prog-id=157 op=LOAD Jan 27 05:56:09.411000 audit: BPF prog-id=158 op=LOAD Jan 27 05:56:09.411000 audit[3479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.411000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.412000 audit: BPF prog-id=158 op=UNLOAD Jan 27 05:56:09.412000 audit[3479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.412000 audit: BPF prog-id=159 op=LOAD Jan 27 05:56:09.412000 audit[3479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.412000 audit: BPF 
prog-id=160 op=LOAD Jan 27 05:56:09.412000 audit[3479]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.412000 audit: BPF prog-id=160 op=UNLOAD Jan 27 05:56:09.412000 audit[3479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.413000 audit: BPF prog-id=159 op=UNLOAD Jan 27 05:56:09.413000 audit[3479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.413000 audit: BPF prog-id=161 op=LOAD Jan 27 05:56:09.413000 audit[3479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=3468 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:09.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303235653763313762656436383933376238623236643835643337 Jan 27 05:56:09.438090 containerd[1613]: time="2026-01-27T05:56:09.438041055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vk4rp,Uid:b54bb9cd-85af-43b8-82d9-1d8d2e73d3e4,Namespace:calico-system,Attempt:0,} returns sandbox id \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\"" Jan 27 05:56:09.849933 kubelet[2857]: E0127 05:56:09.849832 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:10.359983 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2163936702.mount: Deactivated successfully. 
Jan 27 05:56:11.335200 containerd[1613]: time="2026-01-27T05:56:11.335130225Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:11.336532 containerd[1613]: time="2026-01-27T05:56:11.336474030Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 27 05:56:11.337854 containerd[1613]: time="2026-01-27T05:56:11.337783592Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:11.340352 containerd[1613]: time="2026-01-27T05:56:11.340289722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:11.341885 containerd[1613]: time="2026-01-27T05:56:11.341158174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 1.952621194s" Jan 27 05:56:11.341885 containerd[1613]: time="2026-01-27T05:56:11.341201183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 27 05:56:11.342603 containerd[1613]: time="2026-01-27T05:56:11.342571311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 27 05:56:11.367491 containerd[1613]: time="2026-01-27T05:56:11.367418772Z" level=info msg="CreateContainer within sandbox \"27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 27 05:56:11.378619 containerd[1613]: time="2026-01-27T05:56:11.378493716Z" level=info msg="Container fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:11.398755 containerd[1613]: time="2026-01-27T05:56:11.398712813Z" level=info msg="CreateContainer within sandbox \"27799d77cbf98a4360ab37731103918bb933f0ed3c9ae11207ca9590d93a416b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8\"" Jan 27 05:56:11.399475 containerd[1613]: time="2026-01-27T05:56:11.399405148Z" level=info msg="StartContainer for \"fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8\"" Jan 27 05:56:11.401811 containerd[1613]: time="2026-01-27T05:56:11.401768390Z" level=info msg="connecting to shim fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8" address="unix:///run/containerd/s/c221926b5dc23fe19ac018061660d580a564a0c3867b6d5568f515da284ec233" protocol=ttrpc version=3 Jan 27 05:56:11.438612 systemd[1]: Started cri-containerd-fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8.scope - libcontainer container fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8. 
Jan 27 05:56:11.458000 audit: BPF prog-id=162 op=LOAD Jan 27 05:56:11.465145 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 27 05:56:11.465225 kernel: audit: type=1334 audit(1769493371.458:536): prog-id=162 op=LOAD Jan 27 05:56:11.459000 audit: BPF prog-id=163 op=LOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.509654 kernel: audit: type=1334 audit(1769493371.459:537): prog-id=163 op=LOAD Jan 27 05:56:11.509828 kernel: audit: type=1300 audit(1769493371.459:537): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.538470 kernel: audit: type=1327 audit(1769493371.459:537): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.539015 kernel: audit: type=1334 audit(1769493371.459:538): prog-id=163 op=UNLOAD Jan 27 05:56:11.459000 audit: BPF prog-id=163 op=UNLOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.574798 kernel: audit: type=1300 audit(1769493371.459:538): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.574923 kernel: audit: type=1327 audit(1769493371.459:538): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: BPF prog-id=164 op=LOAD Jan 27 05:56:11.612465 kernel: audit: type=1334 audit(1769493371.459:539): prog-id=164 op=LOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.632715 containerd[1613]: time="2026-01-27T05:56:11.628741236Z" level=info msg="StartContainer for \"fcec349b9b6f625fdf8a78d5bc9addd6c0cd772ac61d4556dd99d636acb6bea8\" returns successfully" Jan 27 05:56:11.670918 kernel: audit: type=1300 audit(1769493371.459:539): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.671132 kernel: audit: type=1327 audit(1769493371.459:539): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: BPF prog-id=165 op=LOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: BPF prog-id=165 op=UNLOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: BPF prog-id=164 op=UNLOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.459000 audit: BPF prog-id=166 op=LOAD Jan 27 05:56:11.459000 audit[3528]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=3428 pid=3528 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:11.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663656333343962396236663632356664663861373864356263396164 Jan 27 05:56:11.850590 kubelet[2857]: E0127 05:56:11.850328 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:12.053390 kubelet[2857]: I0127 05:56:12.052901 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b579b76c6-kwq9s" podStartSLOduration=3.098463626 podStartE2EDuration="5.052877327s" podCreationTimestamp="2026-01-27 05:56:07 +0000 UTC" firstStartedPulling="2026-01-27 05:56:09.388005487 +0000 UTC m=+24.733006793" lastFinishedPulling="2026-01-27 05:56:11.342419178 +0000 UTC m=+26.687420494" observedRunningTime="2026-01-27 05:56:12.052289453 +0000 UTC m=+27.397290780" watchObservedRunningTime="2026-01-27 05:56:12.052877327 +0000 UTC m=+27.397878650" Jan 27 05:56:12.116490 kubelet[2857]: E0127 05:56:12.116448 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.116731 kubelet[2857]: W0127 05:56:12.116703 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.116962 kubelet[2857]: E0127 05:56:12.116849 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.117216 kubelet[2857]: E0127 05:56:12.117181 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.117216 kubelet[2857]: W0127 05:56:12.117202 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.117429 kubelet[2857]: E0127 05:56:12.117222 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.117558 kubelet[2857]: E0127 05:56:12.117530 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.117558 kubelet[2857]: W0127 05:56:12.117549 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.117742 kubelet[2857]: E0127 05:56:12.117567 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.118142 kubelet[2857]: E0127 05:56:12.117957 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.118142 kubelet[2857]: W0127 05:56:12.117977 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.118142 kubelet[2857]: E0127 05:56:12.117995 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.118401 kubelet[2857]: E0127 05:56:12.118349 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.118496 kubelet[2857]: W0127 05:56:12.118393 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.118496 kubelet[2857]: E0127 05:56:12.118424 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.118855 kubelet[2857]: E0127 05:56:12.118808 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.118855 kubelet[2857]: W0127 05:56:12.118826 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.118855 kubelet[2857]: E0127 05:56:12.118848 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.119178 kubelet[2857]: E0127 05:56:12.119155 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.119450 kubelet[2857]: W0127 05:56:12.119261 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.119450 kubelet[2857]: E0127 05:56:12.119285 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.119667 kubelet[2857]: E0127 05:56:12.119614 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.119667 kubelet[2857]: W0127 05:56:12.119628 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.119667 kubelet[2857]: E0127 05:56:12.119645 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.119988 kubelet[2857]: E0127 05:56:12.119967 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.119988 kubelet[2857]: W0127 05:56:12.119985 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.120206 kubelet[2857]: E0127 05:56:12.120001 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.120414 kubelet[2857]: E0127 05:56:12.120273 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.120414 kubelet[2857]: W0127 05:56:12.120287 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.120414 kubelet[2857]: E0127 05:56:12.120303 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.120595 kubelet[2857]: E0127 05:56:12.120586 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.120645 kubelet[2857]: W0127 05:56:12.120599 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.120645 kubelet[2857]: E0127 05:56:12.120615 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.121061 kubelet[2857]: E0127 05:56:12.120883 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.121061 kubelet[2857]: W0127 05:56:12.120907 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.121061 kubelet[2857]: E0127 05:56:12.120923 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.121248 kubelet[2857]: E0127 05:56:12.121185 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.121248 kubelet[2857]: W0127 05:56:12.121198 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.121248 kubelet[2857]: E0127 05:56:12.121214 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.121877 kubelet[2857]: E0127 05:56:12.121523 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.121877 kubelet[2857]: W0127 05:56:12.121540 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.121877 kubelet[2857]: E0127 05:56:12.121556 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.121877 kubelet[2857]: E0127 05:56:12.121826 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.121877 kubelet[2857]: W0127 05:56:12.121838 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.121877 kubelet[2857]: E0127 05:56:12.121852 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.185266 kubelet[2857]: E0127 05:56:12.185225 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.185266 kubelet[2857]: W0127 05:56:12.185254 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.185762 kubelet[2857]: E0127 05:56:12.185282 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.185762 kubelet[2857]: E0127 05:56:12.185747 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.186071 kubelet[2857]: W0127 05:56:12.185774 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.186071 kubelet[2857]: E0127 05:56:12.185838 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.186518 kubelet[2857]: E0127 05:56:12.186491 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.186518 kubelet[2857]: W0127 05:56:12.186515 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.186698 kubelet[2857]: E0127 05:56:12.186548 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.186902 kubelet[2857]: E0127 05:56:12.186878 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.186902 kubelet[2857]: W0127 05:56:12.186899 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.187165 kubelet[2857]: E0127 05:56:12.186924 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.187458 kubelet[2857]: E0127 05:56:12.187323 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.187458 kubelet[2857]: W0127 05:56:12.187343 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.187458 kubelet[2857]: E0127 05:56:12.187400 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.187670 kubelet[2857]: E0127 05:56:12.187648 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.187670 kubelet[2857]: W0127 05:56:12.187667 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.187919 kubelet[2857]: E0127 05:56:12.187734 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.188228 kubelet[2857]: E0127 05:56:12.188206 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.188228 kubelet[2857]: W0127 05:56:12.188224 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.188515 kubelet[2857]: E0127 05:56:12.188433 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.188623 kubelet[2857]: E0127 05:56:12.188590 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.188623 kubelet[2857]: W0127 05:56:12.188605 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.188723 kubelet[2857]: E0127 05:56:12.188628 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.189475 kubelet[2857]: E0127 05:56:12.189403 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.189475 kubelet[2857]: W0127 05:56:12.189425 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.189475 kubelet[2857]: E0127 05:56:12.189447 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.190042 kubelet[2857]: E0127 05:56:12.189727 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.190042 kubelet[2857]: W0127 05:56:12.189744 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.190042 kubelet[2857]: E0127 05:56:12.189814 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.190291 kubelet[2857]: E0127 05:56:12.190219 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.190291 kubelet[2857]: W0127 05:56:12.190234 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.190291 kubelet[2857]: E0127 05:56:12.190264 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.191109 kubelet[2857]: E0127 05:56:12.191069 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.191109 kubelet[2857]: W0127 05:56:12.191092 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.191235 kubelet[2857]: E0127 05:56:12.191121 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.192667 kubelet[2857]: E0127 05:56:12.192608 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.192667 kubelet[2857]: W0127 05:56:12.192628 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.192667 kubelet[2857]: E0127 05:56:12.192656 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.194317 kubelet[2857]: E0127 05:56:12.194293 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.194422 kubelet[2857]: W0127 05:56:12.194324 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.194422 kubelet[2857]: E0127 05:56:12.194343 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.195389 kubelet[2857]: E0127 05:56:12.194637 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.195389 kubelet[2857]: W0127 05:56:12.194661 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.195389 kubelet[2857]: E0127 05:56:12.194677 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.195389 kubelet[2857]: E0127 05:56:12.194972 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.195389 kubelet[2857]: W0127 05:56:12.194984 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.195389 kubelet[2857]: E0127 05:56:12.194999 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.195733 kubelet[2857]: E0127 05:56:12.195485 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.195733 kubelet[2857]: W0127 05:56:12.195499 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.195733 kubelet[2857]: E0127 05:56:12.195516 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 05:56:12.197131 kubelet[2857]: E0127 05:56:12.197104 2857 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 05:56:12.197131 kubelet[2857]: W0127 05:56:12.197128 2857 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 05:56:12.197307 kubelet[2857]: E0127 05:56:12.197146 2857 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 05:56:12.367000 audit[3608]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=3608 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:12.367000 audit[3608]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc6c5140c0 a2=0 a3=7ffc6c5140ac items=0 ppid=2979 pid=3608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.367000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:12.372000 audit[3608]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=3608 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:12.372000 audit[3608]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffc6c5140c0 a2=0 a3=7ffc6c5140ac items=0 ppid=2979 pid=3608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.372000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:12.435806 containerd[1613]: time="2026-01-27T05:56:12.435728216Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:12.436930 containerd[1613]: time="2026-01-27T05:56:12.436890040Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:12.438267 containerd[1613]: time="2026-01-27T05:56:12.438202035Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:12.441259 containerd[1613]: time="2026-01-27T05:56:12.441143950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:12.442416 containerd[1613]: time="2026-01-27T05:56:12.442147730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.099532839s" Jan 27 05:56:12.442416 containerd[1613]: time="2026-01-27T05:56:12.442189948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 27 05:56:12.446921 containerd[1613]: time="2026-01-27T05:56:12.446414819Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 27 05:56:12.460977 containerd[1613]: time="2026-01-27T05:56:12.460801575Z" level=info msg="Container 
ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:12.471473 containerd[1613]: time="2026-01-27T05:56:12.471419179Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8\"" Jan 27 05:56:12.472908 containerd[1613]: time="2026-01-27T05:56:12.472069199Z" level=info msg="StartContainer for \"ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8\"" Jan 27 05:56:12.474379 containerd[1613]: time="2026-01-27T05:56:12.474321741Z" level=info msg="connecting to shim ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8" address="unix:///run/containerd/s/1ebd1c124047223b7dabc5657e1e41046e1afccb042a3b79219002054e8166d8" protocol=ttrpc version=3 Jan 27 05:56:12.505769 systemd[1]: Started cri-containerd-ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8.scope - libcontainer container ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8. Jan 27 05:56:12.571000 audit: BPF prog-id=167 op=LOAD Jan 27 05:56:12.571000 audit[3609]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3468 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.571000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563333863376436356331333063323063613030643532396566643864 Jan 27 05:56:12.571000 audit: BPF prog-id=168 op=LOAD Jan 27 05:56:12.571000 audit[3609]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3468 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.571000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563333863376436356331333063323063613030643532396566643864 Jan 27 05:56:12.572000 audit: BPF prog-id=168 op=UNLOAD Jan 27 05:56:12.572000 audit[3609]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563333863376436356331333063323063613030643532396566643864 Jan 27 05:56:12.572000 audit: BPF prog-id=167 op=UNLOAD Jan 27 05:56:12.572000 audit[3609]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563333863376436356331333063323063613030643532396566643864 Jan 27 05:56:12.572000 audit: BPF prog-id=169 op=LOAD Jan 27 05:56:12.572000 audit[3609]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3468 pid=3609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:12.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563333863376436356331333063323063613030643532396566643864 Jan 27 05:56:12.608620 containerd[1613]: time="2026-01-27T05:56:12.608561216Z" level=info msg="StartContainer for \"ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8\" returns successfully" Jan 27 05:56:12.627633 systemd[1]: cri-containerd-ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8.scope: Deactivated successfully. Jan 27 05:56:12.630000 audit: BPF prog-id=169 op=UNLOAD Jan 27 05:56:12.633044 containerd[1613]: time="2026-01-27T05:56:12.632987602Z" level=info msg="received container exit event container_id:\"ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8\" id:\"ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8\" pid:3621 exited_at:{seconds:1769493372 nanos:631320845}" Jan 27 05:56:12.672249 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ec38c7d65c130c20ca00d529efd8d5f3de766d589498b85490fbf2919f3e1dc8-rootfs.mount: Deactivated successfully. 
Jan 27 05:56:13.849587 kubelet[2857]: E0127 05:56:13.849515 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:14.036073 containerd[1613]: time="2026-01-27T05:56:14.035976079Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 27 05:56:15.850106 kubelet[2857]: E0127 05:56:15.850025 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:17.198252 containerd[1613]: time="2026-01-27T05:56:17.198152835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:17.199694 containerd[1613]: time="2026-01-27T05:56:17.199642526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 27 05:56:17.200745 containerd[1613]: time="2026-01-27T05:56:17.200657770Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:17.204062 containerd[1613]: time="2026-01-27T05:56:17.203997769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:17.205503 containerd[1613]: time="2026-01-27T05:56:17.204888144Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.168858595s" Jan 27 05:56:17.205503 containerd[1613]: time="2026-01-27T05:56:17.204933272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 27 05:56:17.208254 containerd[1613]: time="2026-01-27T05:56:17.208207045Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 27 05:56:17.219961 containerd[1613]: time="2026-01-27T05:56:17.218199746Z" level=info msg="Container 3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:17.232082 containerd[1613]: time="2026-01-27T05:56:17.232026143Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51\"" Jan 27 05:56:17.232822 containerd[1613]: time="2026-01-27T05:56:17.232644676Z" level=info msg="StartContainer for \"3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51\"" Jan 27 05:56:17.235385 
containerd[1613]: time="2026-01-27T05:56:17.235335917Z" level=info msg="connecting to shim 3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51" address="unix:///run/containerd/s/1ebd1c124047223b7dabc5657e1e41046e1afccb042a3b79219002054e8166d8" protocol=ttrpc version=3 Jan 27 05:56:17.271686 systemd[1]: Started cri-containerd-3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51.scope - libcontainer container 3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51. Jan 27 05:56:17.345000 audit: BPF prog-id=170 op=LOAD Jan 27 05:56:17.352666 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 27 05:56:17.352820 kernel: audit: type=1334 audit(1769493377.345:552): prog-id=170 op=LOAD Jan 27 05:56:17.345000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.389220 kernel: audit: type=1300 audit(1769493377.345:552): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.418477 kernel: audit: type=1327 audit(1769493377.345:552): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.418599 kernel: audit: type=1334 audit(1769493377.345:553): prog-id=171 op=LOAD Jan 27 05:56:17.345000 audit: BPF prog-id=171 op=LOAD Jan 27 05:56:17.455337 kernel: audit: type=1300 audit(1769493377.345:553): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.492090 kernel: audit: type=1327 audit(1769493377.345:553): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.492228 
kernel: audit: type=1334 audit(1769493377.345:554): prog-id=171 op=UNLOAD Jan 27 05:56:17.345000 audit: BPF prog-id=171 op=UNLOAD Jan 27 05:56:17.345000 audit[3670]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.552110 kernel: audit: type=1300 audit(1769493377.345:554): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.552251 kernel: audit: type=1327 audit(1769493377.345:554): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.345000 audit: BPF prog-id=170 op=UNLOAD Jan 27 05:56:17.345000 audit[3670]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.345000 audit: BPF prog-id=172 op=LOAD Jan 27 05:56:17.345000 audit[3670]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3468 pid=3670 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:17.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361633535343430623239653862393331313036626463323565343935 Jan 27 05:56:17.560394 kernel: audit: type=1334 audit(1769493377.345:555): prog-id=170 op=UNLOAD Jan 27 05:56:17.566970 containerd[1613]: time="2026-01-27T05:56:17.566925809Z" level=info msg="StartContainer for \"3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51\" returns successfully" Jan 27 05:56:17.850119 kubelet[2857]: E0127 05:56:17.849926 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:18.532945 systemd[1]: 
cri-containerd-3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51.scope: Deactivated successfully. Jan 27 05:56:18.533468 systemd[1]: cri-containerd-3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51.scope: Consumed 655ms CPU time, 193M memory peak, 171.3M written to disk. Jan 27 05:56:18.536147 containerd[1613]: time="2026-01-27T05:56:18.535972431Z" level=info msg="received container exit event container_id:\"3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51\" id:\"3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51\" pid:3684 exited_at:{seconds:1769493378 nanos:535699120}" Jan 27 05:56:18.536000 audit: BPF prog-id=172 op=UNLOAD Jan 27 05:56:18.551244 kubelet[2857]: I0127 05:56:18.551208 2857 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 27 05:56:18.581837 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3ac55440b29e8b931106bdc25e495dc7bb92d3dfb9092ec74050e9b4894bee51-rootfs.mount: Deactivated successfully. Jan 27 05:56:18.633216 systemd[1]: Created slice kubepods-burstable-pod72441b5b_b7f3_4945_b29d_35dc24b39ff2.slice - libcontainer container kubepods-burstable-pod72441b5b_b7f3_4945_b29d_35dc24b39ff2.slice. Jan 27 05:56:18.650745 systemd[1]: Created slice kubepods-burstable-pode77345b2_283b_459e_9e5a_f8eabb03a5ae.slice - libcontainer container kubepods-burstable-pode77345b2_283b_459e_9e5a_f8eabb03a5ae.slice. Jan 27 05:56:18.698470 systemd[1]: Created slice kubepods-besteffort-pode4b3467b_adcb_4738_9feb_ff8bcf1c33fe.slice - libcontainer container kubepods-besteffort-pode4b3467b_adcb_4738_9feb_ff8bcf1c33fe.slice. Jan 27 05:56:18.727920 systemd[1]: Created slice kubepods-besteffort-pod13975170_e4d2_41c8_9e0a_f42e4f517791.slice - libcontainer container kubepods-besteffort-pod13975170_e4d2_41c8_9e0a_f42e4f517791.slice. Jan 27 05:56:18.729783 systemd[1]: Created slice kubepods-besteffort-podabcbc1d1_af63_4772_a8eb_6b5783d69e07.slice - libcontainer container kubepods-besteffort-podabcbc1d1_af63_4772_a8eb_6b5783d69e07.slice. Jan 27 05:56:18.745565 systemd[1]: Created slice kubepods-besteffort-pod1d6104d6_b194_464e_a9ff_614dd70f00c4.slice - libcontainer container kubepods-besteffort-pod1d6104d6_b194_464e_a9ff_614dd70f00c4.slice. Jan 27 05:56:18.757219 systemd[1]: Created slice kubepods-besteffort-pod124b40a0_d1a3_4e06_b27d_7331549e3e87.slice - libcontainer container kubepods-besteffort-pod124b40a0_d1a3_4e06_b27d_7331549e3e87.slice. 
Jan 27 05:56:18.761027 kubelet[2857]: I0127 05:56:18.760989 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72441b5b-b7f3-4945-b29d-35dc24b39ff2-config-volume\") pod \"coredns-668d6bf9bc-sf965\" (UID: \"72441b5b-b7f3-4945-b29d-35dc24b39ff2\") " pod="kube-system/coredns-668d6bf9bc-sf965" Jan 27 05:56:18.761321 kubelet[2857]: I0127 05:56:18.761291 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-694nb\" (UniqueName: \"kubernetes.io/projected/72441b5b-b7f3-4945-b29d-35dc24b39ff2-kube-api-access-694nb\") pod \"coredns-668d6bf9bc-sf965\" (UID: \"72441b5b-b7f3-4945-b29d-35dc24b39ff2\") " pod="kube-system/coredns-668d6bf9bc-sf965" Jan 27 05:56:18.761514 kubelet[2857]: I0127 05:56:18.761490 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpxn\" (UniqueName: \"kubernetes.io/projected/e4b3467b-adcb-4738-9feb-ff8bcf1c33fe-kube-api-access-2jpxn\") pod \"calico-apiserver-848cb947c-xgb9b\" (UID: \"e4b3467b-adcb-4738-9feb-ff8bcf1c33fe\") " pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" Jan 27 05:56:18.761668 kubelet[2857]: I0127 05:56:18.761644 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e77345b2-283b-459e-9e5a-f8eabb03a5ae-config-volume\") pod \"coredns-668d6bf9bc-6r9xq\" (UID: \"e77345b2-283b-459e-9e5a-f8eabb03a5ae\") " pod="kube-system/coredns-668d6bf9bc-6r9xq" Jan 27 05:56:18.761868 kubelet[2857]: I0127 05:56:18.761842 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e4b3467b-adcb-4738-9feb-ff8bcf1c33fe-calico-apiserver-certs\") pod \"calico-apiserver-848cb947c-xgb9b\" (UID: \"e4b3467b-adcb-4738-9feb-ff8bcf1c33fe\") " pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" Jan 27 05:56:18.762159 kubelet[2857]: I0127 05:56:18.762115 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvs4\" (UniqueName: \"kubernetes.io/projected/e77345b2-283b-459e-9e5a-f8eabb03a5ae-kube-api-access-6xvs4\") pod \"coredns-668d6bf9bc-6r9xq\" (UID: \"e77345b2-283b-459e-9e5a-f8eabb03a5ae\") " pod="kube-system/coredns-668d6bf9bc-6r9xq" Jan 27 05:56:18.864022 kubelet[2857]: I0127 05:56:18.863884 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46mf\" (UniqueName: \"kubernetes.io/projected/abcbc1d1-af63-4772-a8eb-6b5783d69e07-kube-api-access-h46mf\") pod \"calico-kube-controllers-56c68bdb6-v2r5d\" (UID: \"abcbc1d1-af63-4772-a8eb-6b5783d69e07\") " pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" Jan 27 05:56:18.868391 kubelet[2857]: I0127 05:56:18.866424 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13975170-e4d2-41c8-9e0a-f42e4f517791-config\") pod \"goldmane-666569f655-4mkcq\" (UID: \"13975170-e4d2-41c8-9e0a-f42e4f517791\") " pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:18.868391 kubelet[2857]: I0127 05:56:18.866478 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nzs\" (UniqueName: 
\"kubernetes.io/projected/13975170-e4d2-41c8-9e0a-f42e4f517791-kube-api-access-d4nzs\") pod \"goldmane-666569f655-4mkcq\" (UID: \"13975170-e4d2-41c8-9e0a-f42e4f517791\") " pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:18.868391 kubelet[2857]: I0127 05:56:18.866568 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-backend-key-pair\") pod \"whisker-7c96655cb7-tj5vx\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " pod="calico-system/whisker-7c96655cb7-tj5vx" Jan 27 05:56:18.868391 kubelet[2857]: I0127 05:56:18.866597 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtr8q\" (UniqueName: \"kubernetes.io/projected/1d6104d6-b194-464e-a9ff-614dd70f00c4-kube-api-access-rtr8q\") pod \"whisker-7c96655cb7-tj5vx\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " pod="calico-system/whisker-7c96655cb7-tj5vx" Jan 27 05:56:18.868391 kubelet[2857]: I0127 05:56:18.866677 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/124b40a0-d1a3-4e06-b27d-7331549e3e87-calico-apiserver-certs\") pod \"calico-apiserver-848cb947c-w2cj4\" (UID: \"124b40a0-d1a3-4e06-b27d-7331549e3e87\") " pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" Jan 27 05:56:18.868731 kubelet[2857]: I0127 05:56:18.866718 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abcbc1d1-af63-4772-a8eb-6b5783d69e07-tigera-ca-bundle\") pod \"calico-kube-controllers-56c68bdb6-v2r5d\" (UID: \"abcbc1d1-af63-4772-a8eb-6b5783d69e07\") " pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" Jan 27 05:56:18.868731 kubelet[2857]: I0127 05:56:18.866756 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwtvt\" (UniqueName: \"kubernetes.io/projected/124b40a0-d1a3-4e06-b27d-7331549e3e87-kube-api-access-jwtvt\") pod \"calico-apiserver-848cb947c-w2cj4\" (UID: \"124b40a0-d1a3-4e06-b27d-7331549e3e87\") " pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" Jan 27 05:56:18.868731 kubelet[2857]: I0127 05:56:18.866783 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13975170-e4d2-41c8-9e0a-f42e4f517791-goldmane-ca-bundle\") pod \"goldmane-666569f655-4mkcq\" (UID: \"13975170-e4d2-41c8-9e0a-f42e4f517791\") " pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:18.868731 kubelet[2857]: I0127 05:56:18.866810 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/13975170-e4d2-41c8-9e0a-f42e4f517791-goldmane-key-pair\") pod \"goldmane-666569f655-4mkcq\" (UID: \"13975170-e4d2-41c8-9e0a-f42e4f517791\") " pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:18.868731 kubelet[2857]: I0127 05:56:18.866859 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-ca-bundle\") pod \"whisker-7c96655cb7-tj5vx\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " 
pod="calico-system/whisker-7c96655cb7-tj5vx" Jan 27 05:56:18.948864 containerd[1613]: time="2026-01-27T05:56:18.948821032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sf965,Uid:72441b5b-b7f3-4945-b29d-35dc24b39ff2,Namespace:kube-system,Attempt:0,}" Jan 27 05:56:18.963395 containerd[1613]: time="2026-01-27T05:56:18.962895222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6r9xq,Uid:e77345b2-283b-459e-9e5a-f8eabb03a5ae,Namespace:kube-system,Attempt:0,}" Jan 27 05:56:19.010450 containerd[1613]: time="2026-01-27T05:56:19.008686577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-xgb9b,Uid:e4b3467b-adcb-4738-9feb-ff8bcf1c33fe,Namespace:calico-apiserver,Attempt:0,}" Jan 27 05:56:19.043910 containerd[1613]: time="2026-01-27T05:56:19.043856750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56c68bdb6-v2r5d,Uid:abcbc1d1-af63-4772-a8eb-6b5783d69e07,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:19.046741 containerd[1613]: time="2026-01-27T05:56:19.046659612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4mkcq,Uid:13975170-e4d2-41c8-9e0a-f42e4f517791,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:19.052274 containerd[1613]: time="2026-01-27T05:56:19.052219727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c96655cb7-tj5vx,Uid:1d6104d6-b194-464e-a9ff-614dd70f00c4,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:19.066068 containerd[1613]: time="2026-01-27T05:56:19.066015101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-w2cj4,Uid:124b40a0-d1a3-4e06-b27d-7331549e3e87,Namespace:calico-apiserver,Attempt:0,}" Jan 27 05:56:19.373506 containerd[1613]: time="2026-01-27T05:56:19.373355833Z" level=error msg="Failed to destroy network for sandbox \"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.377181 containerd[1613]: time="2026-01-27T05:56:19.377110288Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-w2cj4,Uid:124b40a0-d1a3-4e06-b27d-7331549e3e87,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.378510 kubelet[2857]: E0127 05:56:19.377547 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.378510 kubelet[2857]: E0127 05:56:19.377649 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" Jan 27 05:56:19.378510 kubelet[2857]: E0127 05:56:19.377685 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" Jan 27 05:56:19.378713 kubelet[2857]: E0127 05:56:19.377759 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58fea9243e643e1162c64f81884b733a17894d3376e80d2c2d0327c5f9d4142c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:56:19.388595 containerd[1613]: time="2026-01-27T05:56:19.388551262Z" level=error msg="Failed to destroy network for sandbox \"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.412775 containerd[1613]: time="2026-01-27T05:56:19.412559562Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sf965,Uid:72441b5b-b7f3-4945-b29d-35dc24b39ff2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.412991 kubelet[2857]: E0127 05:56:19.412886 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.412991 kubelet[2857]: E0127 05:56:19.412956 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sf965" Jan 27 05:56:19.412991 kubelet[2857]: E0127 05:56:19.412984 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sf965" Jan 27 05:56:19.413176 kubelet[2857]: E0127 05:56:19.413039 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-sf965_kube-system(72441b5b-b7f3-4945-b29d-35dc24b39ff2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-sf965_kube-system(72441b5b-b7f3-4945-b29d-35dc24b39ff2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"879f2168687a72921cbcaa1fd3926ab8433a401b80d4a7c2e252045222aad706\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-sf965" podUID="72441b5b-b7f3-4945-b29d-35dc24b39ff2" Jan 27 05:56:19.440403 containerd[1613]: time="2026-01-27T05:56:19.439654115Z" level=error msg="Failed to destroy network for sandbox \"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.455180 containerd[1613]: time="2026-01-27T05:56:19.455121431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56c68bdb6-v2r5d,Uid:abcbc1d1-af63-4772-a8eb-6b5783d69e07,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.455480 kubelet[2857]: E0127 05:56:19.455429 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.455621 kubelet[2857]: E0127 05:56:19.455514 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" Jan 27 05:56:19.455621 kubelet[2857]: E0127 05:56:19.455545 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" Jan 27 05:56:19.456996 kubelet[2857]: E0127 05:56:19.456758 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3be1ecaf3ad9aff09e4eb33c44e1c62017c1fba6a140de8002e6b7d0c51bad40\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:56:19.506842 containerd[1613]: time="2026-01-27T05:56:19.506759484Z" level=error msg="Failed to destroy network for sandbox \"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.511169 containerd[1613]: time="2026-01-27T05:56:19.511039287Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-xgb9b,Uid:e4b3467b-adcb-4738-9feb-ff8bcf1c33fe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.511659 kubelet[2857]: E0127 05:56:19.511599 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.511959 kubelet[2857]: E0127 05:56:19.511677 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" Jan 27 05:56:19.511959 kubelet[2857]: E0127 05:56:19.511708 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" Jan 27 05:56:19.511959 kubelet[2857]: E0127 05:56:19.511770 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-848cb947c-xgb9b_calico-apiserver(e4b3467b-adcb-4738-9feb-ff8bcf1c33fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-848cb947c-xgb9b_calico-apiserver(e4b3467b-adcb-4738-9feb-ff8bcf1c33fe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"471ce59e906cc92d2c5776965bf11561652ecc9e563bb8919567217b9d8427bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:56:19.521779 containerd[1613]: time="2026-01-27T05:56:19.521735425Z" level=error msg="Failed to destroy network for sandbox \"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.525014 containerd[1613]: time="2026-01-27T05:56:19.524951916Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6r9xq,Uid:e77345b2-283b-459e-9e5a-f8eabb03a5ae,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.526013 kubelet[2857]: E0127 05:56:19.525525 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.526013 kubelet[2857]: E0127 05:56:19.525611 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6r9xq" Jan 27 05:56:19.526013 kubelet[2857]: E0127 05:56:19.525645 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6r9xq" Jan 27 05:56:19.527263 containerd[1613]: time="2026-01-27T05:56:19.527226943Z" level=error msg="Failed to destroy network for sandbox \"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.527682 kubelet[2857]: E0127 05:56:19.525722 2857 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6r9xq_kube-system(e77345b2-283b-459e-9e5a-f8eabb03a5ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6r9xq_kube-system(e77345b2-283b-459e-9e5a-f8eabb03a5ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3fca9612c66e35bc7feb85b4ea3fd0ada1b6de122c0c47c9a27083e3848a1341\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6r9xq" podUID="e77345b2-283b-459e-9e5a-f8eabb03a5ae" Jan 27 05:56:19.531241 containerd[1613]: time="2026-01-27T05:56:19.531129908Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4mkcq,Uid:13975170-e4d2-41c8-9e0a-f42e4f517791,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.531632 containerd[1613]: time="2026-01-27T05:56:19.531596282Z" level=error msg="Failed to destroy network for sandbox \"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.531949 kubelet[2857]: E0127 05:56:19.531674 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.531949 kubelet[2857]: E0127 05:56:19.531758 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:19.531949 kubelet[2857]: E0127 05:56:19.531791 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-4mkcq" Jan 27 05:56:19.532892 kubelet[2857]: E0127 05:56:19.532650 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-4mkcq_calico-system(13975170-e4d2-41c8-9e0a-f42e4f517791)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-4mkcq_calico-system(13975170-e4d2-41c8-9e0a-f42e4f517791)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"645d98335e4a303192b2f59f5bedba47945f76da0e85281e071259e272aacc71\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:56:19.535589 containerd[1613]: time="2026-01-27T05:56:19.535528521Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c96655cb7-tj5vx,Uid:1d6104d6-b194-464e-a9ff-614dd70f00c4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.536074 kubelet[2857]: E0127 05:56:19.535950 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.536299 kubelet[2857]: E0127 05:56:19.536043 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c96655cb7-tj5vx" Jan 27 05:56:19.536299 kubelet[2857]: E0127 05:56:19.536197 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c96655cb7-tj5vx" Jan 27 05:56:19.536836 kubelet[2857]: E0127 05:56:19.536645 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7c96655cb7-tj5vx_calico-system(1d6104d6-b194-464e-a9ff-614dd70f00c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7c96655cb7-tj5vx_calico-system(1d6104d6-b194-464e-a9ff-614dd70f00c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d5d6a83debf7fc6c9c215b080306e11e39cd1bbcfa5a56be51c24aea6bddda58\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7c96655cb7-tj5vx" podUID="1d6104d6-b194-464e-a9ff-614dd70f00c4" Jan 27 05:56:19.859850 systemd[1]: Created slice kubepods-besteffort-pod8f740da6_d731_4b30_bf8e_ada1ccd8b61b.slice - libcontainer container kubepods-besteffort-pod8f740da6_d731_4b30_bf8e_ada1ccd8b61b.slice. 
Jan 27 05:56:19.864314 containerd[1613]: time="2026-01-27T05:56:19.864265483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s7bgd,Uid:8f740da6-d731-4b30-bf8e-ada1ccd8b61b,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:19.936074 containerd[1613]: time="2026-01-27T05:56:19.936014832Z" level=error msg="Failed to destroy network for sandbox \"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.940906 containerd[1613]: time="2026-01-27T05:56:19.940839232Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s7bgd,Uid:8f740da6-d731-4b30-bf8e-ada1ccd8b61b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.941103 systemd[1]: run-netns-cni\x2dba77dc44\x2d41ba\x2d6f9b\x2df3f6\x2daea8dca9bc84.mount: Deactivated successfully. Jan 27 05:56:19.941763 kubelet[2857]: E0127 05:56:19.941584 2857 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 05:56:19.941763 kubelet[2857]: E0127 05:56:19.941659 2857 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:19.941763 kubelet[2857]: E0127 05:56:19.941694 2857 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s7bgd" Jan 27 05:56:19.943253 kubelet[2857]: E0127 05:56:19.942805 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"286a89e347162aa5e7c8e21bdd27f8581a364057fdb69cd00a6bc20488d1f7d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" 
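Every RunPodSandbox failure above has the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file that calico/node writes to the host only once it is running, so until the calico-node container starts (at 05:56:26 below) every CNI add/delete fails with exactly the message shown. A minimal Go sketch of an equivalent readiness check (illustrative, not the plugin's actual source):

    package main

    import (
        "errors"
        "fmt"
        "os"
    )

    const nodenamePath = "/var/lib/calico/nodename"

    // calicoNodename returns the node name calico/node has written to the
    // host, or an error mirroring the log message if the file is missing.
    func calicoNodename() (string, error) {
        if _, err := os.Stat(nodenamePath); errors.Is(err, os.ErrNotExist) {
            return "", fmt.Errorf("stat %s: no such file or directory: "+
                "check that the calico/node container is running and has mounted /var/lib/calico/",
                nodenamePath)
        }
        data, err := os.ReadFile(nodenamePath)
        if err != nil {
            return "", err
        }
        return string(data), nil
    }

    func main() {
        name, err := calicoNodename()
        if err != nil {
            fmt.Println("CNI not ready:", err)
            return
        }
        fmt.Println("calico node name:", name)
    }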
Jan 27 05:56:20.064528 containerd[1613]: time="2026-01-27T05:56:20.064228140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 27 05:56:26.436200 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount331706078.mount: Deactivated successfully. Jan 27 05:56:26.468179 containerd[1613]: time="2026-01-27T05:56:26.468109335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:26.469725 containerd[1613]: time="2026-01-27T05:56:26.469523126Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 27 05:56:26.471098 containerd[1613]: time="2026-01-27T05:56:26.471057700Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:26.473884 containerd[1613]: time="2026-01-27T05:56:26.473849780Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 05:56:26.475387 containerd[1613]: time="2026-01-27T05:56:26.475133745Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 6.410357357s" Jan 27 05:56:26.475387 containerd[1613]: time="2026-01-27T05:56:26.475187875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 27 05:56:26.500785 containerd[1613]: time="2026-01-27T05:56:26.500738969Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 27 05:56:26.516401 containerd[1613]: time="2026-01-27T05:56:26.514583562Z" level=info msg="Container 83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:26.528910 containerd[1613]: time="2026-01-27T05:56:26.528861763Z" level=info msg="CreateContainer within sandbox \"16025e7c17bed68937b8b26d85d373466d71e33b7c6e0ba947ed3003e848e6db\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368\"" Jan 27 05:56:26.530129 containerd[1613]: time="2026-01-27T05:56:26.530095746Z" level=info msg="StartContainer for \"83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368\"" Jan 27 05:56:26.532735 containerd[1613]: time="2026-01-27T05:56:26.532382085Z" level=info msg="connecting to shim 83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368" address="unix:///run/containerd/s/1ebd1c124047223b7dabc5657e1e41046e1afccb042a3b79219002054e8166d8" protocol=ttrpc version=3 Jan 27 05:56:26.566710 systemd[1]: Started cri-containerd-83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368.scope - libcontainer container 83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368. 
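As a quick sanity check of the pull above: containerd reports 156,883,537 bytes for ghcr.io/flatcar/calico/node:v3.30.4 fetched in 6.410357357s, roughly 23 MiB/s. The arithmetic, using only the two figures printed in the log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Figures from the "Pulled image" message above.
        const sizeBytes = 156883537
        elapsed := 6*time.Second + 410357357*time.Nanosecond // 6.410357357s

        mib := float64(sizeBytes) / (1 << 20)
        fmt.Printf("pulled %.1f MiB in %s (~%.1f MiB/s)\n",
            mib, elapsed, mib/elapsed.Seconds())
        // pulled 149.6 MiB in 6.410357357s (~23.3 MiB/s)
    }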
Jan 27 05:56:26.643000 audit: BPF prog-id=173 op=LOAD Jan 27 05:56:26.650830 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 27 05:56:26.650971 kernel: audit: type=1334 audit(1769493386.643:558): prog-id=173 op=LOAD Jan 27 05:56:26.643000 audit[3943]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.687089 kernel: audit: type=1300 audit(1769493386.643:558): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.643000 audit: BPF prog-id=174 op=LOAD Jan 27 05:56:26.723983 kernel: audit: type=1327 audit(1769493386.643:558): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.724379 kernel: audit: type=1334 audit(1769493386.643:559): prog-id=174 op=LOAD Jan 27 05:56:26.724430 kernel: audit: type=1300 audit(1769493386.643:559): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit[3943]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.782056 kernel: audit: type=1327 audit(1769493386.643:559): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.782273 kernel: audit: type=1334 audit(1769493386.643:560): prog-id=174 op=UNLOAD Jan 27 05:56:26.643000 audit: BPF prog-id=174 op=UNLOAD Jan 27 05:56:26.818045 kernel: audit: type=1300 audit(1769493386.643:560): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit[3943]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.854720 kernel: audit: type=1327 audit(1769493386.643:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.854854 kernel: audit: type=1334 audit(1769493386.643:561): prog-id=173 op=UNLOAD Jan 27 05:56:26.643000 audit: BPF prog-id=173 op=UNLOAD Jan 27 05:56:26.643000 audit[3943]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.643000 audit: BPF prog-id=175 op=LOAD Jan 27 05:56:26.643000 audit[3943]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3468 pid=3943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:26.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833646366313161623937646665316439626431346339616664623035 Jan 27 05:56:26.864034 containerd[1613]: time="2026-01-27T05:56:26.863968033Z" level=info msg="StartContainer for \"83dcf11ab97dfe1d9bd14c9afdb052c5f19d54f684a29d1c240e97b53c163368\" returns successfully" Jan 27 05:56:26.958569 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 27 05:56:26.958712 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
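The long hex strings in the audit PROCTITLE records above are the invoking command line, hex-encoded because its arguments are separated by NUL bytes; decoding the common prefix recovers the runc invocation that matches comm="runc" exe="/usr/bin/runc" in the SYSCALL records. A small sketch of that decoding:

    package main

    import (
        "encoding/hex"
        "fmt"
        "strings"
    )

    // decodeProctitle turns an audit PROCTITLE value (hex-encoded command
    // line with NUL-separated arguments) back into argv.
    func decodeProctitle(h string) ([]string, error) {
        raw, err := hex.DecodeString(h)
        if err != nil {
            return nil, err
        }
        return strings.Split(strings.TrimRight(string(raw), "\x00"), "\x00"), nil
    }

    func main() {
        // Leading arguments of the PROCTITLE value shared by the records
        // above (only a prefix is decoded here for brevity).
        const h = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67"
        argv, err := decodeProctitle(h)
        if err != nil {
            panic(err)
        }
        fmt.Println(argv) // [runc --root /run/containerd/runc/k8s.io --log]
    }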
Jan 27 05:56:27.162906 kubelet[2857]: I0127 05:56:27.162681 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vk4rp" podStartSLOduration=3.126276668 podStartE2EDuration="20.162631305s" podCreationTimestamp="2026-01-27 05:56:07 +0000 UTC" firstStartedPulling="2026-01-27 05:56:09.440120352 +0000 UTC m=+24.785121662" lastFinishedPulling="2026-01-27 05:56:26.47647499 +0000 UTC m=+41.821476299" observedRunningTime="2026-01-27 05:56:27.160608205 +0000 UTC m=+42.505609526" watchObservedRunningTime="2026-01-27 05:56:27.162631305 +0000 UTC m=+42.507632625" Jan 27 05:56:27.235164 kubelet[2857]: I0127 05:56:27.235109 2857 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-backend-key-pair\") pod \"1d6104d6-b194-464e-a9ff-614dd70f00c4\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " Jan 27 05:56:27.235379 kubelet[2857]: I0127 05:56:27.235181 2857 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtr8q\" (UniqueName: \"kubernetes.io/projected/1d6104d6-b194-464e-a9ff-614dd70f00c4-kube-api-access-rtr8q\") pod \"1d6104d6-b194-464e-a9ff-614dd70f00c4\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " Jan 27 05:56:27.235379 kubelet[2857]: I0127 05:56:27.235221 2857 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-ca-bundle\") pod \"1d6104d6-b194-464e-a9ff-614dd70f00c4\" (UID: \"1d6104d6-b194-464e-a9ff-614dd70f00c4\") " Jan 27 05:56:27.237495 kubelet[2857]: I0127 05:56:27.237002 2857 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "1d6104d6-b194-464e-a9ff-614dd70f00c4" (UID: "1d6104d6-b194-464e-a9ff-614dd70f00c4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 27 05:56:27.245394 kubelet[2857]: I0127 05:56:27.244668 2857 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1d6104d6-b194-464e-a9ff-614dd70f00c4" (UID: "1d6104d6-b194-464e-a9ff-614dd70f00c4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 27 05:56:27.246862 kubelet[2857]: I0127 05:56:27.246819 2857 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6104d6-b194-464e-a9ff-614dd70f00c4-kube-api-access-rtr8q" (OuterVolumeSpecName: "kube-api-access-rtr8q") pod "1d6104d6-b194-464e-a9ff-614dd70f00c4" (UID: "1d6104d6-b194-464e-a9ff-614dd70f00c4"). InnerVolumeSpecName "kube-api-access-rtr8q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 27 05:56:27.337134 kubelet[2857]: I0127 05:56:27.337087 2857 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-backend-key-pair\") on node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" DevicePath \"\"" Jan 27 05:56:27.337134 kubelet[2857]: I0127 05:56:27.337133 2857 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtr8q\" (UniqueName: \"kubernetes.io/projected/1d6104d6-b194-464e-a9ff-614dd70f00c4-kube-api-access-rtr8q\") on node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" DevicePath \"\"" Jan 27 05:56:27.337134 kubelet[2857]: I0127 05:56:27.337152 2857 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6104d6-b194-464e-a9ff-614dd70f00c4-whisker-ca-bundle\") on node \"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf\" DevicePath \"\"" Jan 27 05:56:27.443329 systemd[1]: var-lib-kubelet-pods-1d6104d6\x2db194\x2d464e\x2da9ff\x2d614dd70f00c4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drtr8q.mount: Deactivated successfully. Jan 27 05:56:27.444071 systemd[1]: var-lib-kubelet-pods-1d6104d6\x2db194\x2d464e\x2da9ff\x2d614dd70f00c4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 27 05:56:28.107639 systemd[1]: Removed slice kubepods-besteffort-pod1d6104d6_b194_464e_a9ff_614dd70f00c4.slice - libcontainer container kubepods-besteffort-pod1d6104d6_b194_464e_a9ff_614dd70f00c4.slice. Jan 27 05:56:28.206622 systemd[1]: Created slice kubepods-besteffort-pod2da61fce_ec56_409e_b213_304528221d28.slice - libcontainer container kubepods-besteffort-pod2da61fce_ec56_409e_b213_304528221d28.slice. 
Jan 27 05:56:28.350311 kubelet[2857]: I0127 05:56:28.350218 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2da61fce-ec56-409e-b213-304528221d28-whisker-backend-key-pair\") pod \"whisker-8b7df678c-rdlpv\" (UID: \"2da61fce-ec56-409e-b213-304528221d28\") " pod="calico-system/whisker-8b7df678c-rdlpv" Jan 27 05:56:28.350311 kubelet[2857]: I0127 05:56:28.350313 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdl7k\" (UniqueName: \"kubernetes.io/projected/2da61fce-ec56-409e-b213-304528221d28-kube-api-access-vdl7k\") pod \"whisker-8b7df678c-rdlpv\" (UID: \"2da61fce-ec56-409e-b213-304528221d28\") " pod="calico-system/whisker-8b7df678c-rdlpv" Jan 27 05:56:28.351079 kubelet[2857]: I0127 05:56:28.350342 2857 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da61fce-ec56-409e-b213-304528221d28-whisker-ca-bundle\") pod \"whisker-8b7df678c-rdlpv\" (UID: \"2da61fce-ec56-409e-b213-304528221d28\") " pod="calico-system/whisker-8b7df678c-rdlpv" Jan 27 05:56:28.514660 containerd[1613]: time="2026-01-27T05:56:28.514529185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8b7df678c-rdlpv,Uid:2da61fce-ec56-409e-b213-304528221d28,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:28.763007 systemd-networkd[1499]: calid3de08f22d4: Link UP Jan 27 05:56:28.763434 systemd-networkd[1499]: calid3de08f22d4: Gained carrier Jan 27 05:56:28.801222 containerd[1613]: 2026-01-27 05:56:28.575 [INFO][4122] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 27 05:56:28.801222 containerd[1613]: 2026-01-27 05:56:28.607 [INFO][4122] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0 whisker-8b7df678c- calico-system 2da61fce-ec56-409e-b213-304528221d28 889 0 2026-01-27 05:56:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:8b7df678c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf whisker-8b7df678c-rdlpv eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid3de08f22d4 [] [] }} ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-" Jan 27 05:56:28.801222 containerd[1613]: 2026-01-27 05:56:28.607 [INFO][4122] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.801222 containerd[1613]: 2026-01-27 05:56:28.687 [INFO][4175] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" HandleID="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" 
Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.689 [INFO][4175] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" HandleID="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e2690), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"whisker-8b7df678c-rdlpv", "timestamp":"2026-01-27 05:56:28.687423826 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.689 [INFO][4175] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.689 [INFO][4175] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.689 [INFO][4175] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.700 [INFO][4175] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.708 [INFO][4175] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.714 [INFO][4175] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802600 containerd[1613]: 2026-01-27 05:56:28.717 [INFO][4175] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.720 [INFO][4175] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.720 [INFO][4175] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.722 [INFO][4175] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.729 [INFO][4175] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.735 [INFO][4175] ipam/ipam.go 
1262: Successfully claimed IPs: [192.168.79.1/26] block=192.168.79.0/26 handle="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.735 [INFO][4175] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.1/26] handle="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.736 [INFO][4175] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 05:56:28.802995 containerd[1613]: 2026-01-27 05:56:28.736 [INFO][4175] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.1/26] IPv6=[] ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" HandleID="k8s-pod-network.aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.805807 containerd[1613]: 2026-01-27 05:56:28.744 [INFO][4122] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0", GenerateName:"whisker-8b7df678c-", Namespace:"calico-system", SelfLink:"", UID:"2da61fce-ec56-409e-b213-304528221d28", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"8b7df678c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"whisker-8b7df678c-rdlpv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.79.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid3de08f22d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:28.805927 containerd[1613]: 2026-01-27 05:56:28.744 [INFO][4122] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.1/32] ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.805927 containerd[1613]: 2026-01-27 05:56:28.744 [INFO][4122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid3de08f22d4 
ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.805927 containerd[1613]: 2026-01-27 05:56:28.767 [INFO][4122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.806077 containerd[1613]: 2026-01-27 05:56:28.771 [INFO][4122] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0", GenerateName:"whisker-8b7df678c-", Namespace:"calico-system", SelfLink:"", UID:"2da61fce-ec56-409e-b213-304528221d28", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"8b7df678c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe", Pod:"whisker-8b7df678c-rdlpv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.79.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid3de08f22d4", MAC:"76:8e:f7:4b:7e:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:28.806194 containerd[1613]: 2026-01-27 05:56:28.796 [INFO][4122] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" Namespace="calico-system" Pod="whisker-8b7df678c-rdlpv" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-whisker--8b7df678c--rdlpv-eth0" Jan 27 05:56:28.860642 containerd[1613]: time="2026-01-27T05:56:28.860583837Z" level=info msg="connecting to shim aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe" address="unix:///run/containerd/s/dc544a1e9f3cb10d3a71d8d81388009aa0cdddf85ff541c614b63b27f7ec32ec" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:28.867041 kubelet[2857]: I0127 05:56:28.866434 2857 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6104d6-b194-464e-a9ff-614dd70f00c4" 
path="/var/lib/kubelet/pods/1d6104d6-b194-464e-a9ff-614dd70f00c4/volumes" Jan 27 05:56:28.938548 systemd[1]: Started cri-containerd-aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe.scope - libcontainer container aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe. Jan 27 05:56:28.989000 audit: BPF prog-id=176 op=LOAD Jan 27 05:56:28.990000 audit: BPF prog-id=177 op=LOAD Jan 27 05:56:28.990000 audit[4213]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.991000 audit: BPF prog-id=177 op=UNLOAD Jan 27 05:56:28.991000 audit[4213]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.992000 audit: BPF prog-id=178 op=LOAD Jan 27 05:56:28.992000 audit[4213]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.992000 audit: BPF prog-id=179 op=LOAD Jan 27 05:56:28.992000 audit[4213]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.992000 audit: BPF prog-id=179 op=UNLOAD Jan 27 05:56:28.992000 audit[4213]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.992000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.992000 audit: BPF prog-id=178 op=UNLOAD Jan 27 05:56:28.992000 audit[4213]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:28.992000 audit: BPF prog-id=180 op=LOAD Jan 27 05:56:28.992000 audit[4213]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4202 pid=4213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:28.992000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161653332313064363666663930306637323664376230353932336135 Jan 27 05:56:29.118596 containerd[1613]: time="2026-01-27T05:56:29.115998923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8b7df678c-rdlpv,Uid:2da61fce-ec56-409e-b213-304528221d28,Namespace:calico-system,Attempt:0,} returns sandbox id \"aae3210d66ff900f726d7b05923a506dbedea00fc1fb47b38e2ab90ca6dddcbe\"" Jan 27 05:56:29.121975 containerd[1613]: time="2026-01-27T05:56:29.121423285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 05:56:29.262000 audit: BPF prog-id=181 op=LOAD Jan 27 05:56:29.262000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff0a8c6280 a2=98 a3=1fffffffffffffff items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.262000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.263000 audit: BPF prog-id=181 op=UNLOAD Jan 27 05:56:29.263000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff0a8c6250 a3=0 items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.263000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.263000 audit: BPF prog-id=182 op=LOAD Jan 27 
05:56:29.263000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff0a8c6160 a2=94 a3=3 items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.263000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.263000 audit: BPF prog-id=182 op=UNLOAD Jan 27 05:56:29.263000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff0a8c6160 a2=94 a3=3 items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.263000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.263000 audit: BPF prog-id=183 op=LOAD Jan 27 05:56:29.263000 audit[4274]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff0a8c61a0 a2=94 a3=7fff0a8c6380 items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.263000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.264000 audit: BPF prog-id=183 op=UNLOAD Jan 27 05:56:29.264000 audit[4274]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff0a8c61a0 a2=94 a3=7fff0a8c6380 items=0 ppid=4106 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.264000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 05:56:29.267000 audit: BPF prog-id=184 op=LOAD Jan 27 05:56:29.267000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffcdc659c0 a2=98 a3=3 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.267000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.267000 audit: BPF prog-id=184 op=UNLOAD Jan 27 05:56:29.267000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffcdc65990 a3=0 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.267000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.268000 audit: BPF prog-id=185 op=LOAD Jan 27 05:56:29.268000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffcdc657b0 a2=94 a3=54428f items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.268000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.268000 audit: BPF prog-id=185 op=UNLOAD Jan 27 05:56:29.268000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fffcdc657b0 a2=94 a3=54428f items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.268000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.269000 audit: BPF prog-id=186 op=LOAD Jan 27 05:56:29.269000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffcdc657e0 a2=94 a3=2 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.269000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.269000 audit: BPF prog-id=186 op=UNLOAD Jan 27 05:56:29.269000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fffcdc657e0 a2=0 a3=2 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.269000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.282554 containerd[1613]: time="2026-01-27T05:56:29.282496657Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:29.284395 containerd[1613]: time="2026-01-27T05:56:29.284336157Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:29.284510 containerd[1613]: time="2026-01-27T05:56:29.284395917Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 05:56:29.284969 kubelet[2857]: E0127 05:56:29.284862 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:56:29.284969 kubelet[2857]: E0127 05:56:29.284936 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:56:29.285611 kubelet[2857]: E0127 05:56:29.285543 2857 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aaa555957aa46189e6920dc1ced80c6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:29.289544 containerd[1613]: time="2026-01-27T05:56:29.289130613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 05:56:29.457672 containerd[1613]: time="2026-01-27T05:56:29.457624994Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:29.459543 containerd[1613]: time="2026-01-27T05:56:29.459391850Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 05:56:29.460500 containerd[1613]: time="2026-01-27T05:56:29.459823015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:29.460642 kubelet[2857]: E0127 05:56:29.460016 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:56:29.460642 kubelet[2857]: E0127 05:56:29.460075 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:56:29.462811 kubelet[2857]: E0127 05:56:29.460323 2857 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:29.462811 kubelet[2857]: E0127 05:56:29.462723 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:56:29.489000 audit: BPF prog-id=187 op=LOAD Jan 27 05:56:29.489000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffcdc656a0 a2=94 a3=1 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.489000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.489000 audit: BPF prog-id=187 op=UNLOAD Jan 27 05:56:29.489000 audit[4275]: SYSCALL arch=c000003e 
syscall=3 success=yes exit=0 a0=4 a1=7fffcdc656a0 a2=94 a3=1 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.489000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.503000 audit: BPF prog-id=188 op=LOAD Jan 27 05:56:29.503000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fffcdc65690 a2=94 a3=4 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.503000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.504000 audit: BPF prog-id=188 op=UNLOAD Jan 27 05:56:29.504000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fffcdc65690 a2=0 a3=4 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.504000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.504000 audit: BPF prog-id=189 op=LOAD Jan 27 05:56:29.504000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffcdc654f0 a2=94 a3=5 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.504000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.504000 audit: BPF prog-id=189 op=UNLOAD Jan 27 05:56:29.504000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fffcdc654f0 a2=0 a3=5 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.504000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.504000 audit: BPF prog-id=190 op=LOAD Jan 27 05:56:29.504000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fffcdc65710 a2=94 a3=6 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.504000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.504000 audit: BPF prog-id=190 op=UNLOAD Jan 27 05:56:29.504000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fffcdc65710 a2=0 a3=6 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.504000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.505000 audit: BPF prog-id=191 op=LOAD Jan 27 05:56:29.505000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fffcdc64ec0 a2=94 a3=88 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.505000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.505000 audit: BPF prog-id=192 op=LOAD Jan 27 05:56:29.505000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fffcdc64d40 a2=94 a3=2 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.505000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.505000 audit: BPF prog-id=192 op=UNLOAD Jan 27 05:56:29.505000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fffcdc64d70 a2=0 a3=7fffcdc64e70 items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.505000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.505000 audit: BPF prog-id=191 op=UNLOAD Jan 27 05:56:29.505000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=1680d10 a2=0 a3=484fc674b0a4a86a items=0 ppid=4106 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.505000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 05:56:29.518000 audit: BPF prog-id=193 op=LOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc675f8100 a2=98 a3=1999999999999999 items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.518000 audit: BPF prog-id=193 op=UNLOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc675f80d0 a3=0 items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.518000 audit: BPF prog-id=194 op=LOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc675f7fe0 a2=94 a3=ffff items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.518000 audit: BPF prog-id=194 op=UNLOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc675f7fe0 a2=94 a3=ffff items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.518000 audit: BPF prog-id=195 op=LOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc675f8020 a2=94 a3=7ffc675f8200 items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.518000 audit: BPF prog-id=195 op=UNLOAD Jan 27 05:56:29.518000 audit[4278]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc675f8020 a2=94 a3=7ffc675f8200 items=0 ppid=4106 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.518000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 05:56:29.615343 systemd-networkd[1499]: vxlan.calico: Link UP Jan 27 05:56:29.615355 systemd-networkd[1499]: vxlan.calico: Gained carrier Jan 27 05:56:29.657000 audit: BPF prog-id=196 op=LOAD Jan 27 05:56:29.657000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe8dfda200 a2=98 a3=0 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.657000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.658000 audit: BPF prog-id=196 op=UNLOAD Jan 27 05:56:29.658000 audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe8dfda1d0 a3=0 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 
05:56:29.658000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=197 op=LOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe8dfda010 a2=94 a3=54428f items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=197 op=UNLOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe8dfda010 a2=94 a3=54428f items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=198 op=LOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe8dfda040 a2=94 a3=2 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=198 op=UNLOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe8dfda040 a2=0 a3=2 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=199 op=LOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe8dfd9df0 a2=94 a3=4 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=199 op=UNLOAD Jan 27 05:56:29.659000 
audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe8dfd9df0 a2=94 a3=4 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=200 op=LOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe8dfd9ef0 a2=94 a3=7ffe8dfda070 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.659000 audit: BPF prog-id=200 op=UNLOAD Jan 27 05:56:29.659000 audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe8dfd9ef0 a2=0 a3=7ffe8dfda070 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.659000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.662000 audit: BPF prog-id=201 op=LOAD Jan 27 05:56:29.662000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe8dfd9620 a2=94 a3=2 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.662000 audit: BPF prog-id=201 op=UNLOAD Jan 27 05:56:29.662000 audit[4304]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe8dfd9620 a2=0 a3=2 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.662000 audit: BPF prog-id=202 op=LOAD Jan 27 05:56:29.662000 audit[4304]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe8dfd9720 a2=94 a3=30 items=0 ppid=4106 pid=4304 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 
05:56:29.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 05:56:29.675000 audit: BPF prog-id=203 op=LOAD Jan 27 05:56:29.675000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffef72df760 a2=98 a3=0 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.675000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.675000 audit: BPF prog-id=203 op=UNLOAD Jan 27 05:56:29.675000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffef72df730 a3=0 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.675000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.675000 audit: BPF prog-id=204 op=LOAD Jan 27 05:56:29.675000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffef72df550 a2=94 a3=54428f items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.675000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.675000 audit: BPF prog-id=204 op=UNLOAD Jan 27 05:56:29.675000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffef72df550 a2=94 a3=54428f items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.675000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.675000 audit: BPF prog-id=205 op=LOAD Jan 27 05:56:29.675000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffef72df580 a2=94 a3=2 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.675000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.681000 audit: BPF prog-id=205 op=UNLOAD Jan 27 05:56:29.681000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffef72df580 a2=0 a3=2 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.681000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.884000 audit: BPF prog-id=206 op=LOAD Jan 27 05:56:29.884000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffef72df440 a2=94 a3=1 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.884000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.884000 audit: BPF prog-id=206 op=UNLOAD Jan 27 05:56:29.884000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffef72df440 a2=94 a3=1 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.884000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.900000 audit: BPF prog-id=207 op=LOAD Jan 27 05:56:29.900000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffef72df430 a2=94 a3=4 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.900000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.901000 audit: BPF prog-id=207 op=UNLOAD Jan 27 05:56:29.901000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffef72df430 a2=0 a3=4 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.901000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.901000 audit: BPF prog-id=208 op=LOAD Jan 27 05:56:29.901000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffef72df290 a2=94 a3=5 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.901000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.901000 audit: BPF prog-id=208 op=UNLOAD Jan 27 05:56:29.901000 audit[4308]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffef72df290 a2=0 a3=5 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.901000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.901000 audit: BPF prog-id=209 op=LOAD Jan 27 05:56:29.901000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffef72df4b0 a2=94 a3=6 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.901000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.901000 audit: BPF prog-id=209 op=UNLOAD Jan 27 05:56:29.901000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffef72df4b0 a2=0 a3=6 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.901000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.902000 audit: BPF prog-id=210 op=LOAD Jan 27 05:56:29.902000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffef72dec60 a2=94 a3=88 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.902000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.902000 audit: BPF prog-id=211 op=LOAD Jan 27 05:56:29.902000 audit[4308]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffef72deae0 a2=94 a3=2 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.902000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.902000 audit: BPF prog-id=211 op=UNLOAD Jan 27 05:56:29.902000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffef72deb10 a2=0 a3=7ffef72dec10 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.902000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.903000 audit: BPF prog-id=210 op=UNLOAD Jan 27 05:56:29.903000 audit[4308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=2344cd10 a2=0 a3=c156aab031497706 items=0 ppid=4106 pid=4308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.903000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 05:56:29.910000 audit: BPF prog-id=202 op=UNLOAD Jan 27 05:56:29.910000 audit[4106]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000452780 a2=0 a3=0 items=0 ppid=4082 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.910000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 27 05:56:29.982000 audit[4332]: NETFILTER_CFG table=mangle:121 family=2 entries=16 op=nft_register_chain pid=4332 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:29.982000 audit[4332]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffe52dd8fa0 a2=0 a3=7ffe52dd8f8c items=0 ppid=4106 pid=4332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.982000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:29.992000 audit[4334]: NETFILTER_CFG table=nat:122 family=2 entries=15 op=nft_register_chain pid=4334 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:29.992000 audit[4334]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffc2a721820 a2=0 a3=7ffc2a72180c items=0 ppid=4106 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:29.992000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:30.001000 audit[4331]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4331 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:30.001000 audit[4331]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffc9d762900 a2=0 a3=7ffc9d7628ec items=0 ppid=4106 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:30.001000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:30.005000 audit[4336]: NETFILTER_CFG 
table=filter:124 family=2 entries=94 op=nft_register_chain pid=4336 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:30.005000 audit[4336]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7fffdbe114a0 a2=0 a3=7fffdbe1148c items=0 ppid=4106 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:30.005000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:30.109633 kubelet[2857]: E0127 05:56:30.109526 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:56:30.138000 audit[4345]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4345 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:30.138000 audit[4345]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd871efd60 a2=0 a3=7ffd871efd4c items=0 ppid=2979 pid=4345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:30.138000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:30.146000 audit[4345]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4345 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:30.146000 audit[4345]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd871efd60 a2=0 a3=0 items=0 ppid=2979 pid=4345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:30.146000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:30.456252 systemd-networkd[1499]: calid3de08f22d4: Gained IPv6LL Jan 27 05:56:31.032499 systemd-networkd[1499]: vxlan.calico: Gained IPv6LL Jan 27 05:56:31.112962 kubelet[2857]: E0127 05:56:31.112838 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:56:31.850999 containerd[1613]: time="2026-01-27T05:56:31.850757756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-w2cj4,Uid:124b40a0-d1a3-4e06-b27d-7331549e3e87,Namespace:calico-apiserver,Attempt:0,}" Jan 27 05:56:31.850999 containerd[1613]: time="2026-01-27T05:56:31.850757760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6r9xq,Uid:e77345b2-283b-459e-9e5a-f8eabb03a5ae,Namespace:kube-system,Attempt:0,}" Jan 27 05:56:32.106636 systemd-networkd[1499]: calia8333c7bfc3: Link UP Jan 27 05:56:32.109546 systemd-networkd[1499]: calia8333c7bfc3: Gained carrier Jan 27 05:56:32.143719 containerd[1613]: 2026-01-27 05:56:31.959 [INFO][4350] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0 coredns-668d6bf9bc- kube-system e77345b2-283b-459e-9e5a-f8eabb03a5ae 811 0 2026-01-27 05:55:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf coredns-668d6bf9bc-6r9xq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia8333c7bfc3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-" Jan 27 05:56:32.143719 containerd[1613]: 2026-01-27 05:56:31.960 [INFO][4350] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.143719 containerd[1613]: 2026-01-27 05:56:32.036 [INFO][4372] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" HandleID="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.036 [INFO][4372] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" HandleID="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56e0), 
Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"coredns-668d6bf9bc-6r9xq", "timestamp":"2026-01-27 05:56:32.036157592 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.036 [INFO][4372] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.037 [INFO][4372] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.037 [INFO][4372] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.053 [INFO][4372] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.062 [INFO][4372] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.068 [INFO][4372] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.144057 containerd[1613]: 2026-01-27 05:56:32.070 [INFO][4372] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.074 [INFO][4372] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.075 [INFO][4372] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.077 [INFO][4372] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4 Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.085 [INFO][4372] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.093 [INFO][4372] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.2/26] block=192.168.79.0/26 handle="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.093 [INFO][4372] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.2/26] handle="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.093 
[INFO][4372] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 05:56:32.146489 containerd[1613]: 2026-01-27 05:56:32.094 [INFO][4372] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.2/26] IPv6=[] ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" HandleID="k8s-pod-network.91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.099 [INFO][4350] cni-plugin/k8s.go 418: Populated endpoint ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e77345b2-283b-459e-9e5a-f8eabb03a5ae", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 55, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"coredns-668d6bf9bc-6r9xq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia8333c7bfc3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.099 [INFO][4350] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.2/32] ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.100 [INFO][4350] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8333c7bfc3 ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" 
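The WorkloadEndpoint names in the entries above (for example ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0) follow a visible escaping pattern: hyphens inside the node and pod names are doubled, and the pieces are joined as <node>-k8s-<pod>-<interface>. A minimal Python sketch of that pattern as it appears in this log (illustrative only, not taken from Calico's source):

def workload_endpoint_name(node: str, pod: str, iface: str = "eth0") -> str:
    # Doubling "-" keeps the original node/pod names recoverable after joining.
    esc = lambda s: s.replace("-", "--")
    return f"{esc(node)}-k8s-{esc(pod)}-{iface}"

print(workload_endpoint_name(
    "ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf",
    "coredns-668d6bf9bc-6r9xq",
))
# ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0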
Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.111 [INFO][4350] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.116 [INFO][4350] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e77345b2-283b-459e-9e5a-f8eabb03a5ae", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 55, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4", Pod:"coredns-668d6bf9bc-6r9xq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia8333c7bfc3", MAC:"ce:47:d5:18:db:a8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:32.147741 containerd[1613]: 2026-01-27 05:56:32.136 [INFO][4350] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6r9xq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--6r9xq-eth0" Jan 27 05:56:32.220346 containerd[1613]: time="2026-01-27T05:56:32.219862344Z" level=info msg="connecting to shim 91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4" address="unix:///run/containerd/s/9288b8287f9ad24ad1c6ffe34548fe99aa83c253a8a1b5713baf5f559b9969b9" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:32.252643 systemd-networkd[1499]: cali28a74c36ee8: Link UP Jan 27 05:56:32.259656 
systemd-networkd[1499]: cali28a74c36ee8: Gained carrier Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:31.975 [INFO][4348] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0 calico-apiserver-848cb947c- calico-apiserver 124b40a0-d1a3-4e06-b27d-7331549e3e87 820 0 2026-01-27 05:56:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:848cb947c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf calico-apiserver-848cb947c-w2cj4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali28a74c36ee8 [] [] }} ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:31.975 [INFO][4348] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.063 [INFO][4377] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" HandleID="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.063 [INFO][4377] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" HandleID="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fae0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"calico-apiserver-848cb947c-w2cj4", "timestamp":"2026-01-27 05:56:32.063144907 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.063 [INFO][4377] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.094 [INFO][4377] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
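The "Gained IPv6LL" messages from systemd-networkd in this log mean the cali*/vxlan.calico interfaces have picked up IPv6 link-local addresses. With the kernel's default addr_gen_mode these are derived from the interface MAC via EUI-64 (flip the universal/local bit, splice ff:fe into the middle, prefix with fe80::/64); networkd can also be configured for stable-privacy addresses, so the sketch below illustrates only the default derivation, using the MAC ce:47:d5:18:db:a8 recorded for calia8333c7bfc3 above:

def eui64_link_local(mac: str) -> str:
    # EUI-64 link-local derivation (RFC 4291); if addr_gen_mode is set to
    # something else, the real address on the interface will differ.
    octets = bytearray(int(part, 16) for part in mac.split(":"))
    octets[0] ^= 0x02                      # flip the universal/local bit
    eui = bytes(octets[:3]) + b"\xff\xfe" + bytes(octets[3:])
    groups = [f"{(eui[i] << 8) | eui[i + 1]:x}" for i in range(0, 8, 2)]
    return "fe80::" + ":".join(groups)

print(eui64_link_local("ce:47:d5:18:db:a8"))  # fe80::cc47:d5ff:fe18:dba8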
Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.094 [INFO][4377] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.158 [INFO][4377] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.171 [INFO][4377] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.186 [INFO][4377] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.191 [INFO][4377] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.197 [INFO][4377] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.197 [INFO][4377] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.202 [INFO][4377] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.219 [INFO][4377] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.231 [INFO][4377] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.3/26] block=192.168.79.0/26 handle="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.231 [INFO][4377] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.3/26] handle="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.231 [INFO][4377] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
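Both IPAM traces above run the same sequence: take the host-wide lock, confirm this node's affinity for the block 192.168.79.0/26, claim one free address from it (192.168.79.2 for the coredns pod, 192.168.79.3 for the calico-apiserver pod), write the block back, and release the lock. The block arithmetic itself is plain CIDR math; a short sketch with Python's standard ipaddress module (just the arithmetic, not Calico's allocator):

import ipaddress

block = ipaddress.ip_network("192.168.79.0/26")        # the host-affine block in the log

print(block.num_addresses)                             # 64 addresses per /26 block
print(block.network_address, block.broadcast_address)  # 192.168.79.0 192.168.79.63
for ip in ("192.168.79.2", "192.168.79.3"):
    print(ip, ipaddress.ip_address(ip) in block)       # both pod IPs fall inside the block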
Jan 27 05:56:32.307178 containerd[1613]: 2026-01-27 05:56:32.231 [INFO][4377] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.3/26] IPv6=[] ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" HandleID="k8s-pod-network.bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.239 [INFO][4348] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0", GenerateName:"calico-apiserver-848cb947c-", Namespace:"calico-apiserver", SelfLink:"", UID:"124b40a0-d1a3-4e06-b27d-7331549e3e87", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848cb947c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"calico-apiserver-848cb947c-w2cj4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali28a74c36ee8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.240 [INFO][4348] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.3/32] ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.241 [INFO][4348] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28a74c36ee8 ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.262 [INFO][4348] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" 
Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.274 [INFO][4348] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0", GenerateName:"calico-apiserver-848cb947c-", Namespace:"calico-apiserver", SelfLink:"", UID:"124b40a0-d1a3-4e06-b27d-7331549e3e87", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848cb947c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e", Pod:"calico-apiserver-848cb947c-w2cj4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali28a74c36ee8", MAC:"3e:cf:e3:0b:79:91", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:32.309150 containerd[1613]: 2026-01-27 05:56:32.295 [INFO][4348] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-w2cj4" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--w2cj4-eth0" Jan 27 05:56:32.323000 audit[4418]: NETFILTER_CFG table=filter:127 family=2 entries=42 op=nft_register_chain pid=4418 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:32.330665 kernel: kauditd_printk_skb: 231 callbacks suppressed Jan 27 05:56:32.330776 kernel: audit: type=1325 audit(1769493392.323:639): table=filter:127 family=2 entries=42 op=nft_register_chain pid=4418 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:32.323000 audit[4418]: SYSCALL arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7ffdfb5ce1f0 a2=0 a3=7ffdfb5ce1dc items=0 ppid=4106 pid=4418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.381412 kernel: audit: type=1300 
audit(1769493392.323:639): arch=c000003e syscall=46 success=yes exit=22552 a0=3 a1=7ffdfb5ce1f0 a2=0 a3=7ffdfb5ce1dc items=0 ppid=4106 pid=4418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.386385 systemd[1]: Started cri-containerd-91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4.scope - libcontainer container 91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4. Jan 27 05:56:32.323000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:32.408535 kernel: audit: type=1327 audit(1769493392.323:639): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:32.446770 containerd[1613]: time="2026-01-27T05:56:32.446706050Z" level=info msg="connecting to shim bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e" address="unix:///run/containerd/s/cafbe89472447f0b4f46c297416a781aacbe11ea873d578145755e9f0e6d9c78" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:32.443000 audit[4447]: NETFILTER_CFG table=filter:128 family=2 entries=54 op=nft_register_chain pid=4447 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:32.467411 kernel: audit: type=1325 audit(1769493392.443:640): table=filter:128 family=2 entries=54 op=nft_register_chain pid=4447 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:32.443000 audit[4447]: SYSCALL arch=c000003e syscall=46 success=yes exit=29396 a0=3 a1=7ffc0f1ffd00 a2=0 a3=7ffc0f1ffcec items=0 ppid=4106 pid=4447 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506403 kernel: audit: type=1300 audit(1769493392.443:640): arch=c000003e syscall=46 success=yes exit=29396 a0=3 a1=7ffc0f1ffd00 a2=0 a3=7ffc0f1ffcec items=0 ppid=4106 pid=4447 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.527496 kernel: audit: type=1327 audit(1769493392.443:640): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:32.443000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:32.471000 audit: BPF prog-id=212 op=LOAD Jan 27 05:56:32.535395 kernel: audit: type=1334 audit(1769493392.471:641): prog-id=212 op=LOAD Jan 27 05:56:32.506000 audit: BPF prog-id=213 op=LOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.572454 kernel: audit: type=1334 audit(1769493392.506:642): prog-id=213 op=LOAD Jan 27 05:56:32.572996 kernel: audit: type=1300 
audit(1769493392.506:642): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.573794 kernel: audit: type=1327 audit(1769493392.506:642): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=213 op=UNLOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=214 op=LOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=215 op=LOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=215 op=UNLOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=214 op=UNLOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.506000 audit: BPF prog-id=216 op=LOAD Jan 27 05:56:32.506000 audit[4417]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4404 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.506000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931623264306236346166653138333666613663313765653439323036 Jan 27 05:56:32.604708 systemd[1]: Started cri-containerd-bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e.scope - libcontainer container bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e. 
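The proctitle= values in the audit records above are the audited process's argv, hex-encoded with NUL bytes separating the arguments: the runc records just above decode to runc --root /run/containerd/runc/k8s.io --log ..., and the earlier iptables records decode to iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000. A small Python helper for reading them (decoding logic only; the sample value is copied from the log above):

def decode_proctitle(hex_argv: str) -> str:
    # PROCTITLE hex-encodes the command line, with NUL separating the arguments.
    return bytes.fromhex(hex_argv).replace(b"\x00", b" ").decode()

sample = ("69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365"
          "002D2D77616974003130002D2D776169742D696E74657276616C003530303030")
print(decode_proctitle(sample))
# iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000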
Jan 27 05:56:32.656811 containerd[1613]: time="2026-01-27T05:56:32.656607176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6r9xq,Uid:e77345b2-283b-459e-9e5a-f8eabb03a5ae,Namespace:kube-system,Attempt:0,} returns sandbox id \"91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4\"" Jan 27 05:56:32.666678 containerd[1613]: time="2026-01-27T05:56:32.665335141Z" level=info msg="CreateContainer within sandbox \"91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 05:56:32.681000 audit: BPF prog-id=217 op=LOAD Jan 27 05:56:32.683000 audit: BPF prog-id=218 op=LOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.683000 audit: BPF prog-id=218 op=UNLOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.683000 audit: BPF prog-id=219 op=LOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.683000 audit: BPF prog-id=220 op=LOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.683000 audit: BPF prog-id=220 op=UNLOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4453 pid=4465 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.683000 audit: BPF prog-id=219 op=UNLOAD Jan 27 05:56:32.683000 audit[4465]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.684000 audit: BPF prog-id=221 op=LOAD Jan 27 05:56:32.684000 audit[4465]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4453 pid=4465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.684000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6266386261333830316639343734373635313165323537323266613330 Jan 27 05:56:32.689612 containerd[1613]: time="2026-01-27T05:56:32.687854119Z" level=info msg="Container b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:32.699040 containerd[1613]: time="2026-01-27T05:56:32.698910500Z" level=info msg="CreateContainer within sandbox \"91b2d0b64afe1836fa6c17ee49206109c7d71eb4ad3b6f2241d0e42929dc6fe4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0\"" Jan 27 05:56:32.701392 containerd[1613]: time="2026-01-27T05:56:32.700933907Z" level=info msg="StartContainer for \"b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0\"" Jan 27 05:56:32.704108 containerd[1613]: time="2026-01-27T05:56:32.704063673Z" level=info msg="connecting to shim b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0" address="unix:///run/containerd/s/9288b8287f9ad24ad1c6ffe34548fe99aa83c253a8a1b5713baf5f559b9969b9" protocol=ttrpc version=3 Jan 27 05:56:32.753538 systemd[1]: Started cri-containerd-b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0.scope - libcontainer container b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0. 
Jan 27 05:56:32.784104 containerd[1613]: time="2026-01-27T05:56:32.784053201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-w2cj4,Uid:124b40a0-d1a3-4e06-b27d-7331549e3e87,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bf8ba3801f947476511e25722fa305c51d6f6f9a3035ce97d300c2223fa1f10e\"" Jan 27 05:56:32.789492 containerd[1613]: time="2026-01-27T05:56:32.789449482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:56:32.812000 audit: BPF prog-id=222 op=LOAD Jan 27 05:56:32.812000 audit: BPF prog-id=223 op=LOAD Jan 27 05:56:32.812000 audit[4490]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=223 op=UNLOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=224 op=LOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=225 op=LOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=225 op=UNLOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=224 op=UNLOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.813000 audit: BPF prog-id=226 op=LOAD Jan 27 05:56:32.813000 audit[4490]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4404 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:32.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236633338313662366438366465623138633837613034323230303137 Jan 27 05:56:32.846855 containerd[1613]: time="2026-01-27T05:56:32.846735579Z" level=info msg="StartContainer for \"b6c3816b6d86deb18c87a04220017105e9eacf3cd99ee0a9ec0e4a31d2f5b0c0\" returns successfully" Jan 27 05:56:32.853139 containerd[1613]: time="2026-01-27T05:56:32.852680244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56c68bdb6-v2r5d,Uid:abcbc1d1-af63-4772-a8eb-6b5783d69e07,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:32.862023 containerd[1613]: time="2026-01-27T05:56:32.852887757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s7bgd,Uid:8f740da6-d731-4b30-bf8e-ada1ccd8b61b,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:32.975047 containerd[1613]: time="2026-01-27T05:56:32.974905903Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:32.977641 containerd[1613]: time="2026-01-27T05:56:32.977483510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:56:32.978031 containerd[1613]: time="2026-01-27T05:56:32.977885211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:32.978594 kubelet[2857]: E0127 05:56:32.978536 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:32.979110 kubelet[2857]: E0127 
05:56:32.978612 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:32.979110 kubelet[2857]: E0127 05:56:32.978805 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwtvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:32.980211 kubelet[2857]: E0127 05:56:32.979968 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:56:33.112152 systemd-networkd[1499]: calica029a8b5e7: Link UP Jan 27 05:56:33.114404 systemd-networkd[1499]: calica029a8b5e7: Gained carrier Jan 27 05:56:33.133045 kubelet[2857]: E0127 05:56:33.132997 2857 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:56:33.143649 systemd-networkd[1499]: calia8333c7bfc3: Gained IPv6LL Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:32.990 [INFO][4528] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0 calico-kube-controllers-56c68bdb6- calico-system abcbc1d1-af63-4772-a8eb-6b5783d69e07 821 0 2026-01-27 05:56:07 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:56c68bdb6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf calico-kube-controllers-56c68bdb6-v2r5d eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calica029a8b5e7 [] [] }} ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:32.990 [INFO][4528] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.050 [INFO][4553] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" HandleID="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.051 [INFO][4553] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" HandleID="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039dad0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"calico-kube-controllers-56c68bdb6-v2r5d", "timestamp":"2026-01-27 05:56:33.05074044 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.051 [INFO][4553] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.051 [INFO][4553] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.051 [INFO][4553] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.064 [INFO][4553] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.069 [INFO][4553] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.074 [INFO][4553] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.076 [INFO][4553] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.079 [INFO][4553] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.079 [INFO][4553] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.080 [INFO][4553] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.085 [INFO][4553] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4553] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.4/26] block=192.168.79.0/26 handle="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4553] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.4/26] handle="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4553] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 05:56:33.150943 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4553] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.4/26] IPv6=[] ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" HandleID="k8s-pod-network.7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.098 [INFO][4528] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0", GenerateName:"calico-kube-controllers-56c68bdb6-", Namespace:"calico-system", SelfLink:"", UID:"abcbc1d1-af63-4772-a8eb-6b5783d69e07", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56c68bdb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"calico-kube-controllers-56c68bdb6-v2r5d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica029a8b5e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.098 [INFO][4528] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.4/32] ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.098 [INFO][4528] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica029a8b5e7 ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.115 [INFO][4528] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.116 [INFO][4528] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0", GenerateName:"calico-kube-controllers-56c68bdb6-", Namespace:"calico-system", SelfLink:"", UID:"abcbc1d1-af63-4772-a8eb-6b5783d69e07", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56c68bdb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be", Pod:"calico-kube-controllers-56c68bdb6-v2r5d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica029a8b5e7", MAC:"0e:3c:af:d5:a1:8e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:33.153319 containerd[1613]: 2026-01-27 05:56:33.136 [INFO][4528] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" Namespace="calico-system" Pod="calico-kube-controllers-56c68bdb6-v2r5d" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--kube--controllers--56c68bdb6--v2r5d-eth0" Jan 27 05:56:33.207663 kubelet[2857]: I0127 05:56:33.207405 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-6r9xq" podStartSLOduration=44.207230181 podStartE2EDuration="44.207230181s" podCreationTimestamp="2026-01-27 05:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:56:33.206112298 +0000 UTC m=+48.551113618" watchObservedRunningTime="2026-01-27 05:56:33.207230181 +0000 UTC m=+48.552231519" Jan 27 05:56:33.234900 containerd[1613]: time="2026-01-27T05:56:33.233468917Z" level=info msg="connecting to shim 
7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be" address="unix:///run/containerd/s/0ea00edc4313c71f795951f3f8f4b1de283053b24b76699c66f2419ad5d0b917" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:33.246000 audit[4587]: NETFILTER_CFG table=filter:129 family=2 entries=44 op=nft_register_chain pid=4587 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:33.246000 audit[4587]: SYSCALL arch=c000003e syscall=46 success=yes exit=21952 a0=3 a1=7fffa6f98460 a2=0 a3=7fffa6f9844c items=0 ppid=4106 pid=4587 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.246000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:33.256000 audit[4592]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4592 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:33.256000 audit[4592]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffe6a770f90 a2=0 a3=7ffe6a770f7c items=0 ppid=2979 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.256000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:33.276000 audit[4592]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4592 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:33.276000 audit[4592]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffe6a770f90 a2=0 a3=0 items=0 ppid=2979 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.276000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:33.303664 systemd[1]: Started cri-containerd-7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be.scope - libcontainer container 7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be. 
Jan 27 05:56:33.306755 systemd-networkd[1499]: calid4dee8e52b8: Link UP Jan 27 05:56:33.310461 systemd-networkd[1499]: calid4dee8e52b8: Gained carrier Jan 27 05:56:33.321000 audit[4615]: NETFILTER_CFG table=filter:132 family=2 entries=20 op=nft_register_rule pid=4615 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:33.321000 audit[4615]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd633d3740 a2=0 a3=7ffd633d372c items=0 ppid=2979 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.321000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:33.326000 audit[4615]: NETFILTER_CFG table=nat:133 family=2 entries=14 op=nft_register_rule pid=4615 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:33.326000 audit[4615]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd633d3740 a2=0 a3=0 items=0 ppid=2979 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.326000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.011 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0 csi-node-driver- calico-system 8f740da6-d731-4b30-bf8e-ada1ccd8b61b 694 0 2026-01-27 05:56:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf csi-node-driver-s7bgd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid4dee8e52b8 [] [] }} ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.011 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.066 [INFO][4559] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" HandleID="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.066 [INFO][4559] ipam/ipam_plugin.go 275: Auto assigning 
IP ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" HandleID="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"csi-node-driver-s7bgd", "timestamp":"2026-01-27 05:56:33.066423967 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.066 [INFO][4559] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4559] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.094 [INFO][4559] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.170 [INFO][4559] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.202 [INFO][4559] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.227 [INFO][4559] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.229 [INFO][4559] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.239 [INFO][4559] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.239 [INFO][4559] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.242 [INFO][4559] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261 Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.258 [INFO][4559] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.276 [INFO][4559] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.5/26] block=192.168.79.0/26 handle="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 
27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.276 [INFO][4559] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.5/26] handle="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.276 [INFO][4559] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 05:56:33.336247 containerd[1613]: 2026-01-27 05:56:33.276 [INFO][4559] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.5/26] IPv6=[] ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" HandleID="k8s-pod-network.2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.286 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f740da6-d731-4b30-bf8e-ada1ccd8b61b", ResourceVersion:"694", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"csi-node-driver-s7bgd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid4dee8e52b8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.288 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.5/32] ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.288 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid4dee8e52b8 ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" 
WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.317 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.318 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f740da6-d731-4b30-bf8e-ada1ccd8b61b", ResourceVersion:"694", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261", Pod:"csi-node-driver-s7bgd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid4dee8e52b8", MAC:"e2:59:8c:5b:3f:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:33.337788 containerd[1613]: 2026-01-27 05:56:33.331 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" Namespace="calico-system" Pod="csi-node-driver-s7bgd" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-csi--node--driver--s7bgd-eth0" Jan 27 05:56:33.362000 audit: BPF prog-id=227 op=LOAD Jan 27 05:56:33.364000 audit: BPF prog-id=228 op=LOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=228 op=UNLOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=229 op=LOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=230 op=LOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=230 op=UNLOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=229 op=UNLOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.364000 audit: BPF prog-id=231 op=LOAD Jan 27 05:56:33.364000 audit[4599]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4585 pid=4599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.364000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763623462303464323036326639396535356332313131633865663735 Jan 27 05:56:33.381087 containerd[1613]: time="2026-01-27T05:56:33.380986294Z" level=info msg="connecting to shim 2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261" address="unix:///run/containerd/s/bd12fbb0ced2c76cb91749a29312bfafba867b103093035007734903f1ad4247" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:33.402000 audit[4646]: NETFILTER_CFG table=filter:134 family=2 entries=54 op=nft_register_chain pid=4646 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:33.402000 audit[4646]: SYSCALL arch=c000003e syscall=46 success=yes exit=25992 a0=3 a1=7ffeee98c700 a2=0 a3=7ffeee98c6ec items=0 ppid=4106 pid=4646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.402000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:33.423632 systemd[1]: Started cri-containerd-2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261.scope - libcontainer container 2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261. 
Jan 27 05:56:33.449292 containerd[1613]: time="2026-01-27T05:56:33.449185274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56c68bdb6-v2r5d,Uid:abcbc1d1-af63-4772-a8eb-6b5783d69e07,Namespace:calico-system,Attempt:0,} returns sandbox id \"7cb4b04d2062f99e55c2111c8ef75f2f780753cb2dabde8903570403e5fbf1be\"" Jan 27 05:56:33.452514 containerd[1613]: time="2026-01-27T05:56:33.452352680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 05:56:33.455000 audit: BPF prog-id=232 op=LOAD Jan 27 05:56:33.455000 audit: BPF prog-id=233 op=LOAD Jan 27 05:56:33.455000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.455000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.455000 audit: BPF prog-id=233 op=UNLOAD Jan 27 05:56:33.455000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.455000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.456000 audit: BPF prog-id=234 op=LOAD Jan 27 05:56:33.456000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.456000 audit: BPF prog-id=235 op=LOAD Jan 27 05:56:33.456000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.456000 audit: BPF prog-id=235 op=UNLOAD Jan 27 05:56:33.456000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.456000 audit: BPF prog-id=234 op=UNLOAD Jan 27 05:56:33.456000 audit[4649]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.456000 audit: BPF prog-id=236 op=LOAD Jan 27 05:56:33.456000 audit[4649]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4638 pid=4649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:33.456000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262646532623162626639306362386566363462336665336535333562 Jan 27 05:56:33.481315 containerd[1613]: time="2026-01-27T05:56:33.481252958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s7bgd,Uid:8f740da6-d731-4b30-bf8e-ada1ccd8b61b,Namespace:calico-system,Attempt:0,} returns sandbox id \"2bde2b1bbf90cb8ef64b3fe3e535bb3c7544a5ea380be2ce612e1feacc8ed261\"" Jan 27 05:56:33.609835 containerd[1613]: time="2026-01-27T05:56:33.609647911Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:33.611979 containerd[1613]: time="2026-01-27T05:56:33.611695879Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 05:56:33.611979 containerd[1613]: time="2026-01-27T05:56:33.611781480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:33.612534 kubelet[2857]: E0127 05:56:33.612427 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:56:33.612922 kubelet[2857]: E0127 05:56:33.612735 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:56:33.613247 kubelet[2857]: E0127 05:56:33.613175 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h46mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:33.613836 containerd[1613]: time="2026-01-27T05:56:33.613788966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 05:56:33.614545 kubelet[2857]: E0127 05:56:33.614506 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:56:33.772113 containerd[1613]: time="2026-01-27T05:56:33.772024434Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:33.773762 containerd[1613]: time="2026-01-27T05:56:33.773656800Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 05:56:33.773897 containerd[1613]: time="2026-01-27T05:56:33.773835310Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:33.774388 kubelet[2857]: E0127 05:56:33.774131 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:56:33.774388 kubelet[2857]: E0127 05:56:33.774193 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:56:33.774576 kubelet[2857]: E0127 05:56:33.774348 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:33.776868 containerd[1613]: time="2026-01-27T05:56:33.776715101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 05:56:33.851637 containerd[1613]: time="2026-01-27T05:56:33.851584361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sf965,Uid:72441b5b-b7f3-4945-b29d-35dc24b39ff2,Namespace:kube-system,Attempt:0,}" Jan 27 05:56:33.852180 containerd[1613]: time="2026-01-27T05:56:33.852137911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-xgb9b,Uid:e4b3467b-adcb-4738-9feb-ff8bcf1c33fe,Namespace:calico-apiserver,Attempt:0,}" Jan 27 05:56:33.852806 containerd[1613]: time="2026-01-27T05:56:33.852489511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4mkcq,Uid:13975170-e4d2-41c8-9e0a-f42e4f517791,Namespace:calico-system,Attempt:0,}" Jan 27 05:56:33.914844 systemd-networkd[1499]: cali28a74c36ee8: Gained IPv6LL Jan 27 05:56:33.951507 containerd[1613]: time="2026-01-27T05:56:33.951448463Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:33.955386 containerd[1613]: time="2026-01-27T05:56:33.955158450Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 05:56:33.955859 containerd[1613]: time="2026-01-27T05:56:33.955306482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:33.956848 kubelet[2857]: E0127 05:56:33.956670 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:56:33.956848 kubelet[2857]: E0127 05:56:33.956746 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:56:33.958936 kubelet[2857]: E0127 05:56:33.958856 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:33.961008 kubelet[2857]: E0127 05:56:33.960760 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:34.160866 kubelet[2857]: E0127 05:56:34.160808 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:34.169745 kubelet[2857]: E0127 05:56:34.169528 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:56:34.170670 kubelet[2857]: E0127 05:56:34.170412 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:56:34.221794 systemd-networkd[1499]: cali35fd8739de1: Link UP Jan 27 05:56:34.223687 systemd-networkd[1499]: cali35fd8739de1: Gained carrier Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.013 [INFO][4693] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0 calico-apiserver-848cb947c- calico-apiserver e4b3467b-adcb-4738-9feb-ff8bcf1c33fe 822 0 2026-01-27 05:56:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:848cb947c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf calico-apiserver-848cb947c-xgb9b eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali35fd8739de1 [] [] }} ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.013 [INFO][4693] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.120 [INFO][4719] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" HandleID="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" 
Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.120 [INFO][4719] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" HandleID="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"calico-apiserver-848cb947c-xgb9b", "timestamp":"2026-01-27 05:56:34.120198133 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.122 [INFO][4719] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.123 [INFO][4719] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.124 [INFO][4719] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.146 [INFO][4719] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.157 [INFO][4719] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.167 [INFO][4719] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.174 [INFO][4719] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.179 [INFO][4719] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.179 [INFO][4719] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.183 [INFO][4719] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8 Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.192 [INFO][4719] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 
05:56:34.202 [INFO][4719] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.6/26] block=192.168.79.0/26 handle="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.202 [INFO][4719] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.6/26] handle="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.202 [INFO][4719] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 05:56:34.249703 containerd[1613]: 2026-01-27 05:56:34.202 [INFO][4719] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.6/26] IPv6=[] ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" HandleID="k8s-pod-network.2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.212 [INFO][4693] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0", GenerateName:"calico-apiserver-848cb947c-", Namespace:"calico-apiserver", SelfLink:"", UID:"e4b3467b-adcb-4738-9feb-ff8bcf1c33fe", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848cb947c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"calico-apiserver-848cb947c-xgb9b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35fd8739de1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.213 [INFO][4693] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.6/32] ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 
05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.213 [INFO][4693] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35fd8739de1 ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.225 [INFO][4693] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.225 [INFO][4693] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0", GenerateName:"calico-apiserver-848cb947c-", Namespace:"calico-apiserver", SelfLink:"", UID:"e4b3467b-adcb-4738-9feb-ff8bcf1c33fe", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848cb947c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8", Pod:"calico-apiserver-848cb947c-xgb9b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35fd8739de1", MAC:"b6:ea:19:95:9b:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.251823 containerd[1613]: 2026-01-27 05:56:34.244 [INFO][4693] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" Namespace="calico-apiserver" Pod="calico-apiserver-848cb947c-xgb9b" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-calico--apiserver--848cb947c--xgb9b-eth0" Jan 27 05:56:34.319599 containerd[1613]: time="2026-01-27T05:56:34.319539505Z" level=info msg="connecting to shim 2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8" 
address="unix:///run/containerd/s/be9a71c7a9cffe321d5b28848b0f3e6d7ee2e2c240f04bc609501ae5f3965f39" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:34.388869 systemd-networkd[1499]: calif503418e796: Link UP Jan 27 05:56:34.389234 systemd-networkd[1499]: calif503418e796: Gained carrier Jan 27 05:56:34.418000 audit[4779]: NETFILTER_CFG table=filter:135 family=2 entries=49 op=nft_register_chain pid=4779 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:34.423497 systemd-networkd[1499]: calica029a8b5e7: Gained IPv6LL Jan 27 05:56:34.418000 audit[4779]: SYSCALL arch=c000003e syscall=46 success=yes exit=25436 a0=3 a1=7ffe39b0a9f0 a2=0 a3=7ffe39b0a9dc items=0 ppid=4106 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.418000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.034 [INFO][4684] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0 coredns-668d6bf9bc- kube-system 72441b5b-b7f3-4945-b29d-35dc24b39ff2 817 0 2026-01-27 05:55:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf coredns-668d6bf9bc-sf965 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif503418e796 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.035 [INFO][4684] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.127 [INFO][4725] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" HandleID="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.129 [INFO][4725] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" HandleID="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b240), Attrs:map[string]string{"namespace":"kube-system", 
"node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"coredns-668d6bf9bc-sf965", "timestamp":"2026-01-27 05:56:34.12723979 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.130 [INFO][4725] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.203 [INFO][4725] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.203 [INFO][4725] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.258 [INFO][4725] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.271 [INFO][4725] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.284 [INFO][4725] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.295 [INFO][4725] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.301 [INFO][4725] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.301 [INFO][4725] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.306 [INFO][4725] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7 Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.330 [INFO][4725] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.355 [INFO][4725] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.7/26] block=192.168.79.0/26 handle="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.355 [INFO][4725] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.7/26] handle="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.356 [INFO][4725] ipam/ipam_plugin.go 398: Released 
host-wide IPAM lock. Jan 27 05:56:34.445260 containerd[1613]: 2026-01-27 05:56:34.356 [INFO][4725] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.7/26] IPv6=[] ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" HandleID="k8s-pod-network.c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.446320 containerd[1613]: 2026-01-27 05:56:34.369 [INFO][4684] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"72441b5b-b7f3-4945-b29d-35dc24b39ff2", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 55, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"coredns-668d6bf9bc-sf965", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif503418e796", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.446320 containerd[1613]: 2026-01-27 05:56:34.372 [INFO][4684] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.7/32] ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.446320 containerd[1613]: 2026-01-27 05:56:34.373 [INFO][4684] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif503418e796 ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.446320 containerd[1613]: 
2026-01-27 05:56:34.390 [INFO][4684] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.446320 containerd[1613]: 2026-01-27 05:56:34.397 [INFO][4684] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"72441b5b-b7f3-4945-b29d-35dc24b39ff2", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 55, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7", Pod:"coredns-668d6bf9bc-sf965", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif503418e796", MAC:"0a:0f:f1:4e:cb:fe", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.446320 containerd[1613]: 2026-01-27 05:56:34.431 [INFO][4684] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" Namespace="kube-system" Pod="coredns-668d6bf9bc-sf965" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-coredns--668d6bf9bc--sf965-eth0" Jan 27 05:56:34.448695 systemd[1]: Started cri-containerd-2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8.scope - libcontainer container 2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8. 
Jan 27 05:56:34.455000 audit[4785]: NETFILTER_CFG table=filter:136 family=2 entries=17 op=nft_register_rule pid=4785 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:34.455000 audit[4785]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff0e20dc90 a2=0 a3=7fff0e20dc7c items=0 ppid=2979 pid=4785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.455000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:34.471000 audit[4785]: NETFILTER_CFG table=nat:137 family=2 entries=35 op=nft_register_chain pid=4785 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:34.471000 audit[4785]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7fff0e20dc90 a2=0 a3=7fff0e20dc7c items=0 ppid=2979 pid=4785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.471000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:34.511982 containerd[1613]: time="2026-01-27T05:56:34.511876666Z" level=info msg="connecting to shim c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7" address="unix:///run/containerd/s/c62d00bd656e2d24c48121ae7da27b9e42b304897c071e3fd8deecbe0b6f3fbe" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:34.517483 systemd-networkd[1499]: cali0943ceaac2e: Link UP Jan 27 05:56:34.519172 systemd-networkd[1499]: cali0943ceaac2e: Gained carrier Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.044 [INFO][4683] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0 goldmane-666569f655- calico-system 13975170-e4d2-41c8-9e0a-f42e4f517791 818 0 2026-01-27 05:56:05 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf goldmane-666569f655-4mkcq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0943ceaac2e [] [] }} ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.045 [INFO][4683] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.150 [INFO][4729] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" 
HandleID="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.153 [INFO][4729] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" HandleID="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332140), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", "pod":"goldmane-666569f655-4mkcq", "timestamp":"2026-01-27 05:56:34.149617423 +0000 UTC"}, Hostname:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.154 [INFO][4729] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.357 [INFO][4729] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.357 [INFO][4729] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf' Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.383 [INFO][4729] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.418 [INFO][4729] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.450 [INFO][4729] ipam/ipam.go 511: Trying affinity for 192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.455 [INFO][4729] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.460 [INFO][4729] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.0/26 host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.460 [INFO][4729] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.0/26 handle="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.464 [INFO][4729] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71 Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.472 [INFO][4729] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.0/26 handle="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" 
host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.487 [INFO][4729] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.8/26] block=192.168.79.0/26 handle="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.488 [INFO][4729] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.8/26] handle="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" host="ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf" Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.488 [INFO][4729] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 05:56:34.554739 containerd[1613]: 2026-01-27 05:56:34.488 [INFO][4729] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.8/26] IPv6=[] ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" HandleID="k8s-pod-network.1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Workload="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.558907 containerd[1613]: 2026-01-27 05:56:34.505 [INFO][4683] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"13975170-e4d2-41c8-9e0a-f42e4f517791", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"", Pod:"goldmane-666569f655-4mkcq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.79.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0943ceaac2e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.558907 containerd[1613]: 2026-01-27 05:56:34.506 [INFO][4683] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.8/32] ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.558907 
containerd[1613]: 2026-01-27 05:56:34.506 [INFO][4683] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0943ceaac2e ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.558907 containerd[1613]: 2026-01-27 05:56:34.520 [INFO][4683] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.558907 containerd[1613]: 2026-01-27 05:56:34.522 [INFO][4683] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"13975170-e4d2-41c8-9e0a-f42e4f517791", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 5, 56, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-nightly-20260126-2100-93cea66762a12710a5bf", ContainerID:"1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71", Pod:"goldmane-666569f655-4mkcq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.79.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0943ceaac2e", MAC:"42:05:27:db:46:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 05:56:34.558907 containerd[1613]: 2026-01-27 05:56:34.548 [INFO][4683] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" Namespace="calico-system" Pod="goldmane-666569f655-4mkcq" WorkloadEndpoint="ci--4592--0--0--nightly--20260126--2100--93cea66762a12710a5bf-k8s-goldmane--666569f655--4mkcq-eth0" Jan 27 05:56:34.575000 audit: BPF prog-id=237 op=LOAD Jan 27 05:56:34.577000 audit: BPF prog-id=238 op=LOAD Jan 27 05:56:34.577000 audit[4770]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.577000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.577000 audit: BPF prog-id=238 op=UNLOAD Jan 27 05:56:34.577000 audit[4770]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.577000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.578000 audit: BPF prog-id=239 op=LOAD Jan 27 05:56:34.578000 audit[4770]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.580000 audit: BPF prog-id=240 op=LOAD Jan 27 05:56:34.580000 audit[4770]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.580000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.580000 audit[4839]: NETFILTER_CFG table=filter:138 family=2 entries=48 op=nft_register_chain pid=4839 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:34.580000 audit[4839]: SYSCALL arch=c000003e syscall=46 success=yes exit=22704 a0=3 a1=7fff1c3f0470 a2=0 a3=7fff1c3f045c items=0 ppid=4106 pid=4839 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.582000 audit: BPF prog-id=240 op=UNLOAD Jan 27 05:56:34.582000 audit[4770]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.580000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:34.582000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.582000 audit: BPF prog-id=239 op=UNLOAD Jan 27 05:56:34.582000 audit[4770]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.582000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.582000 audit: BPF prog-id=241 op=LOAD Jan 27 05:56:34.582000 audit[4770]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4756 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.582000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266343065636132383662333537386362323731356665666266363365 Jan 27 05:56:34.601178 systemd[1]: Started cri-containerd-c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7.scope - libcontainer container c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7. 
Jan 27 05:56:34.624000 audit: BPF prog-id=242 op=LOAD Jan 27 05:56:34.625000 audit: BPF prog-id=243 op=LOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=243 op=UNLOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=244 op=LOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=245 op=LOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=245 op=UNLOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=244 op=UNLOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.625000 audit: BPF prog-id=246 op=LOAD Jan 27 05:56:34.625000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4811 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339346363376465343037393939626432306161383466366164656338 Jan 27 05:56:34.670011 containerd[1613]: time="2026-01-27T05:56:34.669864017Z" level=info msg="connecting to shim 1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71" address="unix:///run/containerd/s/5d3d1dab6f72e0906717a844b26c710b19b8e6443bfa0cc17c540971a6f84e58" namespace=k8s.io protocol=ttrpc version=3 Jan 27 05:56:34.734690 systemd[1]: Started cri-containerd-1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71.scope - libcontainer container 1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71. 
Jan 27 05:56:34.744000 audit[4888]: NETFILTER_CFG table=filter:139 family=2 entries=64 op=nft_register_chain pid=4888 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 05:56:34.744000 audit[4888]: SYSCALL arch=c000003e syscall=46 success=yes exit=31104 a0=3 a1=7ffd1d926640 a2=0 a3=7ffd1d92662c items=0 ppid=4106 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.744000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 05:56:34.753444 containerd[1613]: time="2026-01-27T05:56:34.753105923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sf965,Uid:72441b5b-b7f3-4945-b29d-35dc24b39ff2,Namespace:kube-system,Attempt:0,} returns sandbox id \"c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7\"" Jan 27 05:56:34.771653 containerd[1613]: time="2026-01-27T05:56:34.770509660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848cb947c-xgb9b,Uid:e4b3467b-adcb-4738-9feb-ff8bcf1c33fe,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2f40eca286b3578cb2715fefbf63e4734b2b22ed024a4d657831275b5c55dab8\"" Jan 27 05:56:34.774079 containerd[1613]: time="2026-01-27T05:56:34.774037077Z" level=info msg="CreateContainer within sandbox \"c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 05:56:34.785172 containerd[1613]: time="2026-01-27T05:56:34.785075181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:56:34.792070 containerd[1613]: time="2026-01-27T05:56:34.792032158Z" level=info msg="Container 483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4: CDI devices from CRI Config.CDIDevices: []" Jan 27 05:56:34.792000 audit: BPF prog-id=247 op=LOAD Jan 27 05:56:34.793000 audit: BPF prog-id=248 op=LOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=248 op=UNLOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=249 op=LOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=250 op=LOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=250 op=UNLOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=249 op=UNLOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.793000 audit: BPF prog-id=251 op=LOAD Jan 27 05:56:34.793000 audit[4879]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4861 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164353765303061663766306535373631623164343665346433636263 Jan 27 05:56:34.802454 containerd[1613]: time="2026-01-27T05:56:34.802338896Z" level=info msg="CreateContainer within sandbox \"c94cc7de407999bd20aa84f6adec89d754a8030d09e4f825b1ba51cd250eb7d7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4\"" Jan 27 05:56:34.805729 containerd[1613]: time="2026-01-27T05:56:34.805435335Z" level=info msg="StartContainer for \"483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4\"" Jan 27 05:56:34.808541 containerd[1613]: time="2026-01-27T05:56:34.808493561Z" level=info msg="connecting to shim 483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4" address="unix:///run/containerd/s/c62d00bd656e2d24c48121ae7da27b9e42b304897c071e3fd8deecbe0b6f3fbe" protocol=ttrpc version=3 Jan 27 05:56:34.838750 systemd[1]: Started cri-containerd-483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4.scope - libcontainer container 483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4. Jan 27 05:56:34.888634 containerd[1613]: time="2026-01-27T05:56:34.888577058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4mkcq,Uid:13975170-e4d2-41c8-9e0a-f42e4f517791,Namespace:calico-system,Attempt:0,} returns sandbox id \"1d57e00af7f0e5761b1d46e4d3cbc567b12481a82cb22b8fdb12048ca291ba71\"" Jan 27 05:56:34.890000 audit: BPF prog-id=252 op=LOAD Jan 27 05:56:34.891000 audit: BPF prog-id=253 op=LOAD Jan 27 05:56:34.891000 audit[4908]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.891000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.891000 audit: BPF prog-id=253 op=UNLOAD Jan 27 05:56:34.891000 audit[4908]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.891000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.892000 audit: BPF prog-id=254 op=LOAD Jan 27 05:56:34.892000 audit[4908]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.892000 audit: BPF prog-id=255 op=LOAD Jan 27 05:56:34.892000 audit[4908]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.892000 audit: BPF prog-id=255 op=UNLOAD Jan 27 05:56:34.892000 audit[4908]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.892000 audit: BPF prog-id=254 op=UNLOAD Jan 27 05:56:34.892000 audit[4908]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.893000 audit: BPF prog-id=256 op=LOAD Jan 27 05:56:34.893000 audit[4908]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4811 pid=4908 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:34.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438333331383438376639663963373336366162376635313961316536 Jan 27 05:56:34.921385 containerd[1613]: time="2026-01-27T05:56:34.921272956Z" level=info msg="StartContainer for \"483318487f9f9c7366ab7f519a1e6d1995c7ad6d0efe7c38f435b42a8a1bc0c4\" returns successfully" Jan 27 05:56:34.936795 systemd-networkd[1499]: calid4dee8e52b8: Gained IPv6LL Jan 27 05:56:34.947477 containerd[1613]: time="2026-01-27T05:56:34.947419104Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:34.949208 containerd[1613]: time="2026-01-27T05:56:34.949149842Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:56:34.949450 containerd[1613]: time="2026-01-27T05:56:34.949270496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:34.949641 kubelet[2857]: E0127 05:56:34.949512 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:34.949641 kubelet[2857]: E0127 05:56:34.949574 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:34.950831 containerd[1613]: time="2026-01-27T05:56:34.950356300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 05:56:34.951053 kubelet[2857]: E0127 05:56:34.950450 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jpxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-xgb9b_calico-apiserver(e4b3467b-adcb-4738-9feb-ff8bcf1c33fe): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:34.954069 kubelet[2857]: E0127 05:56:34.954006 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:56:35.111699 containerd[1613]: time="2026-01-27T05:56:35.111523511Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:35.114101 containerd[1613]: time="2026-01-27T05:56:35.113889374Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 05:56:35.114228 containerd[1613]: time="2026-01-27T05:56:35.114023260Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:35.114542 kubelet[2857]: E0127 05:56:35.114485 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:56:35.114692 kubelet[2857]: E0127 05:56:35.114554 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:56:35.115005 kubelet[2857]: E0127 05:56:35.114923 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4nzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4mkcq_calico-system(13975170-e4d2-41c8-9e0a-f42e4f517791): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:35.116489 kubelet[2857]: E0127 05:56:35.116437 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:56:35.174476 kubelet[2857]: E0127 05:56:35.174317 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:56:35.195392 kubelet[2857]: E0127 05:56:35.194689 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:56:35.197314 kubelet[2857]: E0127 05:56:35.196384 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:56:35.197768 kubelet[2857]: E0127 05:56:35.197684 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:35.249000 audit[4955]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4955 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:35.249000 audit[4955]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe4f019030 a2=0 a3=7ffe4f01901c items=0 ppid=2979 pid=4955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:35.249000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:35.257000 audit[4955]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=4955 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:35.257000 audit[4955]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe4f019030 a2=0 a3=7ffe4f01901c items=0 ppid=2979 pid=4955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:35.257000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:35.289479 kubelet[2857]: I0127 05:56:35.289405 2857 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-sf965" podStartSLOduration=46.289380083 podStartE2EDuration="46.289380083s" podCreationTimestamp="2026-01-27 05:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 05:56:35.288967773 +0000 UTC m=+50.633969094" watchObservedRunningTime="2026-01-27 05:56:35.289380083 +0000 UTC m=+50.634381410" Jan 27 05:56:35.575954 systemd-networkd[1499]: calif503418e796: Gained IPv6LL Jan 27 05:56:35.703724 systemd-networkd[1499]: cali0943ceaac2e: Gained IPv6LL Jan 27 05:56:36.023836 systemd-networkd[1499]: cali35fd8739de1: Gained IPv6LL Jan 27 05:56:36.194899 kubelet[2857]: E0127 05:56:36.194673 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 
05:56:36.194899 kubelet[2857]: E0127 05:56:36.194673 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:56:36.277000 audit[4957]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=4957 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:36.277000 audit[4957]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffe35dc040 a2=0 a3=7fffe35dc02c items=0 ppid=2979 pid=4957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:36.277000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:36.288000 audit[4957]: NETFILTER_CFG table=nat:143 family=2 entries=56 op=nft_register_chain pid=4957 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:56:36.288000 audit[4957]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7fffe35dc040 a2=0 a3=7fffe35dc02c items=0 ppid=2979 pid=4957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:36.288000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:56:38.190761 ntpd[1570]: Listen normally on 7 vxlan.calico 192.168.79.0:123 Jan 27 05:56:38.190875 ntpd[1570]: Listen normally on 8 calid3de08f22d4 [fe80::ecee:eeff:feee:eeee%4]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 7 vxlan.calico 192.168.79.0:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 8 calid3de08f22d4 [fe80::ecee:eeff:feee:eeee%4]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 9 vxlan.calico [fe80::6407:c0ff:feb3:1407%5]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 10 calia8333c7bfc3 [fe80::ecee:eeff:feee:eeee%8]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 11 cali28a74c36ee8 [fe80::ecee:eeff:feee:eeee%9]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 12 calica029a8b5e7 [fe80::ecee:eeff:feee:eeee%10]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 13 calid4dee8e52b8 [fe80::ecee:eeff:feee:eeee%11]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 14 cali35fd8739de1 [fe80::ecee:eeff:feee:eeee%12]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 15 calif503418e796 [fe80::ecee:eeff:feee:eeee%13]:123 Jan 27 05:56:38.191515 ntpd[1570]: 27 Jan 05:56:38 ntpd[1570]: Listen normally on 16 cali0943ceaac2e [fe80::ecee:eeff:feee:eeee%14]:123 Jan 27 05:56:38.190923 ntpd[1570]: Listen normally on 9 vxlan.calico 
[fe80::6407:c0ff:feb3:1407%5]:123 Jan 27 05:56:38.190964 ntpd[1570]: Listen normally on 10 calia8333c7bfc3 [fe80::ecee:eeff:feee:eeee%8]:123 Jan 27 05:56:38.191009 ntpd[1570]: Listen normally on 11 cali28a74c36ee8 [fe80::ecee:eeff:feee:eeee%9]:123 Jan 27 05:56:38.191050 ntpd[1570]: Listen normally on 12 calica029a8b5e7 [fe80::ecee:eeff:feee:eeee%10]:123 Jan 27 05:56:38.191095 ntpd[1570]: Listen normally on 13 calid4dee8e52b8 [fe80::ecee:eeff:feee:eeee%11]:123 Jan 27 05:56:38.191136 ntpd[1570]: Listen normally on 14 cali35fd8739de1 [fe80::ecee:eeff:feee:eeee%12]:123 Jan 27 05:56:38.191176 ntpd[1570]: Listen normally on 15 calif503418e796 [fe80::ecee:eeff:feee:eeee%13]:123 Jan 27 05:56:38.191215 ntpd[1570]: Listen normally on 16 cali0943ceaac2e [fe80::ecee:eeff:feee:eeee%14]:123 Jan 27 05:56:42.852396 containerd[1613]: time="2026-01-27T05:56:42.851417013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 05:56:43.012616 containerd[1613]: time="2026-01-27T05:56:43.012542323Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:43.014490 containerd[1613]: time="2026-01-27T05:56:43.014424733Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 05:56:43.014690 containerd[1613]: time="2026-01-27T05:56:43.014429511Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:43.014913 kubelet[2857]: E0127 05:56:43.014729 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:56:43.014913 kubelet[2857]: E0127 05:56:43.014795 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:56:43.015720 kubelet[2857]: E0127 05:56:43.014950 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aaa555957aa46189e6920dc1ced80c6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:43.018068 containerd[1613]: time="2026-01-27T05:56:43.017763076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 05:56:43.185518 containerd[1613]: time="2026-01-27T05:56:43.185330693Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:43.187438 containerd[1613]: time="2026-01-27T05:56:43.187376832Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 05:56:43.187789 containerd[1613]: time="2026-01-27T05:56:43.187386514Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:43.187906 kubelet[2857]: E0127 05:56:43.187720 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:56:43.187906 kubelet[2857]: E0127 05:56:43.187791 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:56:43.188349 kubelet[2857]: E0127 05:56:43.188274 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:43.190037 kubelet[2857]: E0127 05:56:43.189876 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:56:45.851947 containerd[1613]: time="2026-01-27T05:56:45.851895992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:56:46.012891 containerd[1613]: time="2026-01-27T05:56:46.012826946Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:46.014584 containerd[1613]: time="2026-01-27T05:56:46.014465405Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 
27 05:56:46.014584 containerd[1613]: time="2026-01-27T05:56:46.014513296Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:46.014898 kubelet[2857]: E0127 05:56:46.014845 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:46.015880 kubelet[2857]: E0127 05:56:46.014916 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:46.015880 kubelet[2857]: E0127 05:56:46.015090 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwtvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:46.016425 kubelet[2857]: E0127 05:56:46.016333 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:56:47.851832 containerd[1613]: time="2026-01-27T05:56:47.851768871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 05:56:48.008116 containerd[1613]: time="2026-01-27T05:56:48.008040196Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:48.009632 containerd[1613]: time="2026-01-27T05:56:48.009573872Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 05:56:48.009832 containerd[1613]: time="2026-01-27T05:56:48.009697041Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:48.009920 kubelet[2857]: E0127 05:56:48.009870 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:56:48.010469 kubelet[2857]: E0127 05:56:48.009937 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:56:48.010469 kubelet[2857]: E0127 05:56:48.010156 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h46mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:48.011669 kubelet[2857]: E0127 05:56:48.011577 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:56:48.852983 containerd[1613]: time="2026-01-27T05:56:48.852560237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:56:49.014832 containerd[1613]: time="2026-01-27T05:56:49.014762511Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:49.016528 containerd[1613]: time="2026-01-27T05:56:49.016470980Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:56:49.016528 containerd[1613]: time="2026-01-27T05:56:49.016483555Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:49.016935 kubelet[2857]: E0127 05:56:49.016757 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:49.016935 kubelet[2857]: E0127 05:56:49.016821 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:56:49.017778 
kubelet[2857]: E0127 05:56:49.017594 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jpxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-xgb9b_calico-apiserver(e4b3467b-adcb-4738-9feb-ff8bcf1c33fe): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:49.018428 containerd[1613]: time="2026-01-27T05:56:49.017257465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 05:56:49.020041 kubelet[2857]: E0127 05:56:49.019752 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:56:49.179052 containerd[1613]: time="2026-01-27T05:56:49.178875107Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:49.180538 containerd[1613]: time="2026-01-27T05:56:49.180457830Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 05:56:49.180794 containerd[1613]: time="2026-01-27T05:56:49.180499148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:49.181029 kubelet[2857]: E0127 05:56:49.180948 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:56:49.181029 kubelet[2857]: E0127 05:56:49.181023 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:56:49.181328 kubelet[2857]: E0127 05:56:49.181214 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4nzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4mkcq_calico-system(13975170-e4d2-41c8-9e0a-f42e4f517791): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:49.182978 kubelet[2857]: E0127 05:56:49.182882 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:56:49.851282 containerd[1613]: time="2026-01-27T05:56:49.851235495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 05:56:50.002875 containerd[1613]: time="2026-01-27T05:56:50.002793060Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:50.004899 containerd[1613]: time="2026-01-27T05:56:50.004752336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 05:56:50.005290 containerd[1613]: time="2026-01-27T05:56:50.004795953Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:50.005531 kubelet[2857]: E0127 05:56:50.005495 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:56:50.005821 kubelet[2857]: E0127 05:56:50.005549 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:56:50.005821 kubelet[2857]: E0127 05:56:50.005703 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:50.009088 containerd[1613]: time="2026-01-27T05:56:50.009044111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 05:56:50.162384 containerd[1613]: time="2026-01-27T05:56:50.162194701Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:56:50.163836 containerd[1613]: time="2026-01-27T05:56:50.163750952Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 05:56:50.163836 containerd[1613]: time="2026-01-27T05:56:50.163816393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 05:56:50.164411 kubelet[2857]: E0127 05:56:50.164336 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:56:50.165781 kubelet[2857]: E0127 05:56:50.164431 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:56:50.166155 kubelet[2857]: E0127 05:56:50.165964 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 05:56:50.167582 kubelet[2857]: E0127 05:56:50.167443 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:56:53.853906 kubelet[2857]: E0127 05:56:53.852604 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:56:55.909480 systemd[1]: Started sshd@9-10.128.0.23:22-4.153.228.146:50154.service - OpenSSH per-connection server daemon (4.153.228.146:50154). Jan 27 05:56:55.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.128.0.23:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:56:55.918121 kernel: kauditd_printk_skb: 239 callbacks suppressed Jan 27 05:56:55.918197 kernel: audit: type=1130 audit(1769493415.909:728): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.128.0.23:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:56:56.152000 audit[4986]: USER_ACCT pid=4986 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.154166 sshd[4986]: Accepted publickey for core from 4.153.228.146 port 50154 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:56:56.158624 sshd-session[4986]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:56:56.179436 systemd-logind[1576]: New session 11 of user core. Jan 27 05:56:56.185520 kernel: audit: type=1101 audit(1769493416.152:729): pid=4986 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.185644 kernel: audit: type=1103 audit(1769493416.155:730): pid=4986 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.155000 audit[4986]: CRED_ACQ pid=4986 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.227786 kernel: audit: type=1006 audit(1769493416.155:731): pid=4986 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 27 05:56:56.228402 systemd[1]: Started session-11.scope - Session 11 of User core. 
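The kubelet and containerd records above show every Calico image on this node (apiserver, goldmane, kube-controllers, csi, node-driver-registrar, whisker, whisker-backend) failing with a 404 from ghcr.io and then cycling from ErrImagePull into ImagePullBackOff. As a rough triage aid, the following Python sketch tallies those failures from journal text piped on stdin; it is illustrative only, its regexes match just the message shapes visible in this log (the `failed to pull and unpack image "..."` and `pod="..."` fragments), and it assumes one journal record per input line, as `journalctl --no-pager` emits.

    #!/usr/bin/env python3
    """Tally failed image pulls and back-off pods from journal text on stdin.

    Illustrative sketch: the patterns only match the containerd/kubelet message
    shapes seen in the surrounding log; anything else is ignored.
    """
    import re
    import sys
    from collections import Counter

    # containerd/kubelet: failed to pull and unpack image \"ghcr.io/...:v3.30.4\"
    # (the character class tolerates the varying levels of quote escaping above)
    IMAGE_RE = re.compile(r'failed to pull and unpack image [\\"]+([^"\\]+)')
    # kubelet pod_workers: ... pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b"
    POD_RE = re.compile(r'pod="([^"]+)"')

    images, pods = Counter(), Counter()
    for line in sys.stdin:
        for m in IMAGE_RE.finditer(line):
            images[m.group(1)] += 1
        if "Error syncing pod" in line:
            for p in POD_RE.finditer(line):
                pods[p.group(1)] += 1

    print("failed image references:")
    for image, count in images.most_common():
        print(f"  {count:4d}  {image}")
    print("pods stuck waiting on pulls:")
    for pod, count in pods.most_common():
        print(f"  {count:4d}  {pod}")

Run against the kubelet/containerd portion of the journal, this gives a quick per-image and per-pod picture of the failure without wading through the full stream.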
Jan 27 05:56:56.155000 audit[4986]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe280ceb60 a2=3 a3=0 items=0 ppid=1 pid=4986 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:56.155000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:56:56.269965 kernel: audit: type=1300 audit(1769493416.155:731): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe280ceb60 a2=3 a3=0 items=0 ppid=1 pid=4986 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:56:56.270071 kernel: audit: type=1327 audit(1769493416.155:731): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:56:56.239000 audit[4986]: USER_START pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.307962 kernel: audit: type=1105 audit(1769493416.239:732): pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.308115 kernel: audit: type=1103 audit(1769493416.246:733): pid=4990 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.246000 audit[4990]: CRED_ACQ pid=4990 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.466114 sshd[4990]: Connection closed by 4.153.228.146 port 50154 Jan 27 05:56:56.467659 sshd-session[4986]: pam_unix(sshd:session): session closed for user core Jan 27 05:56:56.470000 audit[4986]: USER_END pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.490063 systemd[1]: sshd@9-10.128.0.23:22-4.153.228.146:50154.service: Deactivated successfully. Jan 27 05:56:56.494939 systemd[1]: session-11.scope: Deactivated successfully. Jan 27 05:56:56.498705 systemd-logind[1576]: Session 11 logged out. Waiting for processes to exit. Jan 27 05:56:56.502556 systemd-logind[1576]: Removed session 11. 
Jan 27 05:56:56.470000 audit[4986]: CRED_DISP pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.533899 kernel: audit: type=1106 audit(1769493416.470:734): pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.533994 kernel: audit: type=1104 audit(1769493416.470:735): pid=4986 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:56:56.486000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.128.0.23:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:56:56.854659 kubelet[2857]: E0127 05:56:56.853449 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:57:00.852702 kubelet[2857]: E0127 05:57:00.852573 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:57:00.855874 kubelet[2857]: E0127 05:57:00.855827 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:57:01.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.128.0.23:22-4.153.228.146:50158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:01.513649 systemd[1]: Started sshd@10-10.128.0.23:22-4.153.228.146:50158.service - OpenSSH per-connection server daemon (4.153.228.146:50158). 
Jan 27 05:57:01.522736 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:01.522863 kernel: audit: type=1130 audit(1769493421.512:737): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.128.0.23:22-4.153.228.146:50158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:01.763000 audit[5028]: USER_ACCT pid=5028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.766147 sshd[5028]: Accepted publickey for core from 4.153.228.146 port 50158 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:01.770709 sshd-session[5028]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:01.785748 systemd-logind[1576]: New session 12 of user core. Jan 27 05:57:01.767000 audit[5028]: CRED_ACQ pid=5028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.796403 kernel: audit: type=1101 audit(1769493421.763:738): pid=5028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.796471 kernel: audit: type=1103 audit(1769493421.767:739): pid=5028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.822760 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 27 05:57:01.839232 kernel: audit: type=1006 audit(1769493421.767:740): pid=5028 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 27 05:57:01.767000 audit[5028]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3a452e80 a2=3 a3=0 items=0 ppid=1 pid=5028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:01.767000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:01.881048 kernel: audit: type=1300 audit(1769493421.767:740): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3a452e80 a2=3 a3=0 items=0 ppid=1 pid=5028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:01.881313 kernel: audit: type=1327 audit(1769493421.767:740): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:01.881706 kernel: audit: type=1105 audit(1769493421.831:741): pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.831000 audit[5028]: USER_START pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.843000 audit[5032]: CRED_ACQ pid=5032 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:01.942070 kernel: audit: type=1103 audit(1769493421.843:742): pid=5032 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:02.030355 sshd[5032]: Connection closed by 4.153.228.146 port 50158 Jan 27 05:57:02.032529 sshd-session[5028]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:02.033000 audit[5028]: USER_END pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:02.040013 systemd[1]: sshd@10-10.128.0.23:22-4.153.228.146:50158.service: Deactivated successfully. Jan 27 05:57:02.044282 systemd[1]: session-12.scope: Deactivated successfully. Jan 27 05:57:02.048928 systemd-logind[1576]: Session 12 logged out. Waiting for processes to exit. Jan 27 05:57:02.050732 systemd-logind[1576]: Removed session 12. 
Jan 27 05:57:02.072407 kernel: audit: type=1106 audit(1769493422.033:743): pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:02.072553 kernel: audit: type=1104 audit(1769493422.033:744): pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:02.033000 audit[5028]: CRED_DISP pid=5028 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:02.038000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.128.0.23:22-4.153.228.146:50158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:03.852011 kubelet[2857]: E0127 05:57:03.851946 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:57:04.855233 kubelet[2857]: E0127 05:57:04.855150 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:57:06.856704 containerd[1613]: time="2026-01-27T05:57:06.856337249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 05:57:07.012941 containerd[1613]: time="2026-01-27T05:57:07.012877351Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:07.014563 containerd[1613]: time="2026-01-27T05:57:07.014502267Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 05:57:07.014929 containerd[1613]: time="2026-01-27T05:57:07.014541758Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:07.015071 kubelet[2857]: E0127 05:57:07.014798 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:57:07.015071 kubelet[2857]: E0127 05:57:07.014880 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:57:07.016026 kubelet[2857]: E0127 05:57:07.015097 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aaa555957aa46189e6920dc1ced80c6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:07.017660 containerd[1613]: time="2026-01-27T05:57:07.017593617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 05:57:07.085450 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:07.085585 kernel: audit: type=1130 audit(1769493427.074:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.128.0.23:22-4.153.228.146:37748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:07.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.128.0.23:22-4.153.228.146:37748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:57:07.075377 systemd[1]: Started sshd@11-10.128.0.23:22-4.153.228.146:37748.service - OpenSSH per-connection server daemon (4.153.228.146:37748). Jan 27 05:57:07.216995 containerd[1613]: time="2026-01-27T05:57:07.216927182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:07.218391 containerd[1613]: time="2026-01-27T05:57:07.218299636Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 05:57:07.218537 containerd[1613]: time="2026-01-27T05:57:07.218454079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:07.218941 kubelet[2857]: E0127 05:57:07.218885 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:57:07.219079 kubelet[2857]: E0127 05:57:07.218954 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:57:07.219176 kubelet[2857]: E0127 05:57:07.219120 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Restart
Policy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:07.220591 kubelet[2857]: E0127 05:57:07.220526 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:57:07.315000 audit[5046]: USER_ACCT pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.317495 sshd[5046]: Accepted publickey for core from 4.153.228.146 port 37748 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:07.321036 sshd-session[5046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:07.331723 systemd-logind[1576]: New session 13 of user core. Jan 27 05:57:07.315000 audit[5046]: CRED_ACQ pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.374195 kernel: audit: type=1101 audit(1769493427.315:747): pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.374327 kernel: audit: type=1103 audit(1769493427.315:748): pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.374847 kernel: audit: type=1006 audit(1769493427.315:749): pid=5046 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 27 05:57:07.315000 audit[5046]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7307f230 a2=3 a3=0 items=0 ppid=1 pid=5046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:07.391751 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 27 05:57:07.420407 kernel: audit: type=1300 audit(1769493427.315:749): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe7307f230 a2=3 a3=0 items=0 ppid=1 pid=5046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:07.315000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:07.399000 audit[5046]: USER_START pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.433481 kernel: audit: type=1327 audit(1769493427.315:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:07.433538 kernel: audit: type=1105 audit(1769493427.399:750): pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.399000 audit[5050]: CRED_ACQ pid=5050 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.493399 kernel: audit: type=1103 audit(1769493427.399:751): pid=5050 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.597500 sshd[5050]: Connection closed by 4.153.228.146 port 37748 Jan 27 05:57:07.598687 sshd-session[5046]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:07.599000 audit[5046]: USER_END pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.606841 systemd[1]: sshd@11-10.128.0.23:22-4.153.228.146:37748.service: Deactivated successfully. Jan 27 05:57:07.607178 systemd-logind[1576]: Session 13 logged out. Waiting for processes to exit. Jan 27 05:57:07.612273 systemd[1]: session-13.scope: Deactivated successfully. Jan 27 05:57:07.618553 systemd-logind[1576]: Removed session 13. 
Jan 27 05:57:07.639436 kernel: audit: type=1106 audit(1769493427.599:752): pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.639585 kernel: audit: type=1104 audit(1769493427.599:753): pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.599000 audit[5046]: CRED_DISP pid=5046 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.605000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.128.0.23:22-4.153.228.146:37748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:07.681617 systemd[1]: Started sshd@12-10.128.0.23:22-4.153.228.146:37750.service - OpenSSH per-connection server daemon (4.153.228.146:37750). Jan 27 05:57:07.680000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.128.0.23:22-4.153.228.146:37750 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:07.952000 audit[5063]: USER_ACCT pid=5063 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.954527 sshd[5063]: Accepted publickey for core from 4.153.228.146 port 37750 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:07.954000 audit[5063]: CRED_ACQ pid=5063 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.954000 audit[5063]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6eaa77f0 a2=3 a3=0 items=0 ppid=1 pid=5063 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:07.954000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:07.957397 sshd-session[5063]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:07.967553 systemd-logind[1576]: New session 14 of user core. Jan 27 05:57:07.972653 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 27 05:57:07.982000 audit[5063]: USER_START pid=5063 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:07.986000 audit[5067]: CRED_ACQ pid=5067 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.250228 sshd[5067]: Connection closed by 4.153.228.146 port 37750 Jan 27 05:57:08.251665 sshd-session[5063]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:08.256000 audit[5063]: USER_END pid=5063 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.257000 audit[5063]: CRED_DISP pid=5063 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.266077 systemd[1]: sshd@12-10.128.0.23:22-4.153.228.146:37750.service: Deactivated successfully. Jan 27 05:57:08.266920 systemd-logind[1576]: Session 14 logged out. Waiting for processes to exit. Jan 27 05:57:08.266000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.128.0.23:22-4.153.228.146:37750 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:08.272550 systemd[1]: session-14.scope: Deactivated successfully. Jan 27 05:57:08.282945 systemd-logind[1576]: Removed session 14. Jan 27 05:57:08.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.128.0.23:22-4.153.228.146:37762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:08.299680 systemd[1]: Started sshd@13-10.128.0.23:22-4.153.228.146:37762.service - OpenSSH per-connection server daemon (4.153.228.146:37762). 
Jan 27 05:57:08.545000 audit[5077]: USER_ACCT pid=5077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.549477 sshd[5077]: Accepted publickey for core from 4.153.228.146 port 37762 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:08.550000 audit[5077]: CRED_ACQ pid=5077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.550000 audit[5077]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7c1ddfb0 a2=3 a3=0 items=0 ppid=1 pid=5077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:08.550000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:08.554732 sshd-session[5077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:08.568661 systemd-logind[1576]: New session 15 of user core. Jan 27 05:57:08.576660 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 27 05:57:08.582000 audit[5077]: USER_START pid=5077 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.587000 audit[5081]: CRED_ACQ pid=5081 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.834486 sshd[5081]: Connection closed by 4.153.228.146 port 37762 Jan 27 05:57:08.836619 sshd-session[5077]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:08.839000 audit[5077]: USER_END pid=5077 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.839000 audit[5077]: CRED_DISP pid=5077 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:08.846771 systemd[1]: sshd@13-10.128.0.23:22-4.153.228.146:37762.service: Deactivated successfully. Jan 27 05:57:08.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.128.0.23:22-4.153.228.146:37762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:08.853466 systemd[1]: session-15.scope: Deactivated successfully. Jan 27 05:57:08.859072 systemd-logind[1576]: Session 15 logged out. Waiting for processes to exit. 
Jan 27 05:57:08.861931 systemd-logind[1576]: Removed session 15. Jan 27 05:57:11.851898 containerd[1613]: time="2026-01-27T05:57:11.851822319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 05:57:12.009610 containerd[1613]: time="2026-01-27T05:57:12.009542744Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:12.011173 containerd[1613]: time="2026-01-27T05:57:12.011028699Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 05:57:12.011173 containerd[1613]: time="2026-01-27T05:57:12.011090226Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:12.011461 kubelet[2857]: E0127 05:57:12.011405 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:57:12.012138 kubelet[2857]: E0127 05:57:12.011476 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:57:12.012138 kubelet[2857]: E0127 05:57:12.011994 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h46mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:12.013124 containerd[1613]: time="2026-01-27T05:57:12.013047002Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:57:12.013388 kubelet[2857]: E0127 05:57:12.013243 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:57:12.176714 containerd[1613]: time="2026-01-27T05:57:12.176544986Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:12.178030 containerd[1613]: time="2026-01-27T05:57:12.177964289Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:57:12.178193 containerd[1613]: time="2026-01-27T05:57:12.178079376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:12.178325 kubelet[2857]: E0127 05:57:12.178273 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:12.178506 kubelet[2857]: E0127 05:57:12.178343 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:12.178641 kubelet[2857]: E0127 05:57:12.178551 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwtvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:12.180181 kubelet[2857]: E0127 05:57:12.180123 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:57:12.853219 containerd[1613]: time="2026-01-27T05:57:12.853047158Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 05:57:13.008230 containerd[1613]: time="2026-01-27T05:57:13.008156916Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:13.009785 containerd[1613]: time="2026-01-27T05:57:13.009708330Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:13.010001 containerd[1613]: time="2026-01-27T05:57:13.009713439Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 05:57:13.010463 kubelet[2857]: E0127 
05:57:13.010305 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:57:13.010463 kubelet[2857]: E0127 05:57:13.010393 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 05:57:13.011205 kubelet[2857]: E0127 05:57:13.010592 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4nzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4mkcq_calico-system(13975170-e4d2-41c8-9e0a-f42e4f517791): ErrImagePull: rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:13.012057 kubelet[2857]: E0127 05:57:13.011854 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:57:13.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.128.0.23:22-4.153.228.146:37778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:13.878711 systemd[1]: Started sshd@14-10.128.0.23:22-4.153.228.146:37778.service - OpenSSH per-connection server daemon (4.153.228.146:37778). Jan 27 05:57:13.888301 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 27 05:57:13.888470 kernel: audit: type=1130 audit(1769493433.877:773): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.128.0.23:22-4.153.228.146:37778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:14.112000 audit[5104]: USER_ACCT pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.117086 sshd-session[5104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:14.118569 sshd[5104]: Accepted publickey for core from 4.153.228.146 port 37778 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:14.127301 systemd-logind[1576]: New session 16 of user core. Jan 27 05:57:14.114000 audit[5104]: CRED_ACQ pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.171189 kernel: audit: type=1101 audit(1769493434.112:774): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.171348 kernel: audit: type=1103 audit(1769493434.114:775): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.172437 kernel: audit: type=1006 audit(1769493434.114:776): pid=5104 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 27 05:57:14.173480 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 27 05:57:14.114000 audit[5104]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe239ac1b0 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:14.219418 kernel: audit: type=1300 audit(1769493434.114:776): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe239ac1b0 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:14.114000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:14.183000 audit[5104]: USER_START pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.266096 kernel: audit: type=1327 audit(1769493434.114:776): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:14.266206 kernel: audit: type=1105 audit(1769493434.183:777): pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.193000 audit[5108]: CRED_ACQ pid=5108 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.291480 kernel: audit: type=1103 audit(1769493434.193:778): pid=5108 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.381394 sshd[5108]: Connection closed by 4.153.228.146 port 37778 Jan 27 05:57:14.381760 sshd-session[5104]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:14.383000 audit[5104]: USER_END pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.394185 systemd[1]: sshd@14-10.128.0.23:22-4.153.228.146:37778.service: Deactivated successfully. Jan 27 05:57:14.398440 systemd[1]: session-16.scope: Deactivated successfully. Jan 27 05:57:14.406138 systemd-logind[1576]: Session 16 logged out. Waiting for processes to exit. Jan 27 05:57:14.408694 systemd-logind[1576]: Removed session 16. 
Jan 27 05:57:14.383000 audit[5104]: CRED_DISP pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.447130 kernel: audit: type=1106 audit(1769493434.383:779): pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.447304 kernel: audit: type=1104 audit(1769493434.383:780): pid=5104 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:14.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.128.0.23:22-4.153.228.146:37778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:16.857934 containerd[1613]: time="2026-01-27T05:57:16.855655175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 05:57:17.009109 containerd[1613]: time="2026-01-27T05:57:17.009032202Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:17.010769 containerd[1613]: time="2026-01-27T05:57:17.010708570Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 05:57:17.010973 containerd[1613]: time="2026-01-27T05:57:17.010831829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:17.011190 kubelet[2857]: E0127 05:57:17.011133 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:57:17.011714 kubelet[2857]: E0127 05:57:17.011215 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 05:57:17.011991 kubelet[2857]: E0127 05:57:17.011920 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:17.015568 containerd[1613]: time="2026-01-27T05:57:17.015511006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 05:57:17.170085 containerd[1613]: time="2026-01-27T05:57:17.169908129Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:17.171437 containerd[1613]: time="2026-01-27T05:57:17.171351718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 05:57:17.171858 containerd[1613]: time="2026-01-27T05:57:17.171499867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:17.171980 kubelet[2857]: E0127 05:57:17.171684 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:57:17.171980 kubelet[2857]: E0127 05:57:17.171748 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 05:57:17.172414 kubelet[2857]: E0127 05:57:17.172315 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:17.173864 kubelet[2857]: E0127 05:57:17.173803 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:57:19.435919 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:19.436024 kernel: audit: type=1130 audit(1769493439.427:782): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.128.0.23:22-4.153.228.146:59644 comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:19.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.128.0.23:22-4.153.228.146:59644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:19.429001 systemd[1]: Started sshd@15-10.128.0.23:22-4.153.228.146:59644.service - OpenSSH per-connection server daemon (4.153.228.146:59644). Jan 27 05:57:19.666000 audit[5122]: USER_ACCT pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.671020 sshd-session[5122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:19.671904 sshd[5122]: Accepted publickey for core from 4.153.228.146 port 59644 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:19.683852 systemd-logind[1576]: New session 17 of user core. Jan 27 05:57:19.700109 kernel: audit: type=1101 audit(1769493439.666:783): pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.700240 kernel: audit: type=1103 audit(1769493439.668:784): pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.668000 audit[5122]: CRED_ACQ pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.725862 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 27 05:57:19.741023 kernel: audit: type=1006 audit(1769493439.668:785): pid=5122 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 27 05:57:19.742218 kernel: audit: type=1300 audit(1769493439.668:785): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe5de456b0 a2=3 a3=0 items=0 ppid=1 pid=5122 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:19.668000 audit[5122]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe5de456b0 a2=3 a3=0 items=0 ppid=1 pid=5122 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:19.668000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:19.745000 audit[5122]: USER_START pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.820642 kernel: audit: type=1327 audit(1769493439.668:785): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:19.820751 kernel: audit: type=1105 audit(1769493439.745:786): pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.757000 audit[5126]: CRED_ACQ pid=5126 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.846343 kernel: audit: type=1103 audit(1769493439.757:787): pid=5126 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.854930 containerd[1613]: time="2026-01-27T05:57:19.854887760Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:57:19.955093 sshd[5126]: Connection closed by 4.153.228.146 port 59644 Jan 27 05:57:19.956545 sshd-session[5122]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:19.958000 audit[5122]: USER_END pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.996388 kernel: audit: type=1106 audit(1769493439.958:788): pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.958000 audit[5122]: CRED_DISP pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:19.999847 systemd-logind[1576]: Session 17 logged out. Waiting for processes to exit. Jan 27 05:57:20.000264 systemd[1]: sshd@15-10.128.0.23:22-4.153.228.146:59644.service: Deactivated successfully. Jan 27 05:57:20.005460 systemd[1]: session-17.scope: Deactivated successfully. Jan 27 05:57:20.011750 systemd-logind[1576]: Removed session 17. Jan 27 05:57:19.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.128.0.23:22-4.153.228.146:59644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:20.025401 kernel: audit: type=1104 audit(1769493439.958:789): pid=5122 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:20.033627 containerd[1613]: time="2026-01-27T05:57:20.033570019Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:20.035390 containerd[1613]: time="2026-01-27T05:57:20.035281282Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:57:20.035390 containerd[1613]: time="2026-01-27T05:57:20.035337588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:20.035757 kubelet[2857]: E0127 05:57:20.035700 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:20.036245 kubelet[2857]: E0127 05:57:20.035763 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:20.036762 kubelet[2857]: E0127 05:57:20.036011 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jpxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-xgb9b_calico-apiserver(e4b3467b-adcb-4738-9feb-ff8bcf1c33fe): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:20.038218 kubelet[2857]: E0127 05:57:20.038161 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:57:21.853072 kubelet[2857]: E0127 05:57:21.852988 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:57:25.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.128.0.23:22-4.153.228.146:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:24.999970 systemd[1]: Started sshd@16-10.128.0.23:22-4.153.228.146:59348.service - OpenSSH per-connection server daemon (4.153.228.146:59348). Jan 27 05:57:25.006978 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:25.007061 kernel: audit: type=1130 audit(1769493445.000:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.128.0.23:22-4.153.228.146:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:25.243000 audit[5139]: USER_ACCT pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.245744 sshd[5139]: Accepted publickey for core from 4.153.228.146 port 59348 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:25.248451 sshd-session[5139]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:25.263509 systemd-logind[1576]: New session 18 of user core. Jan 27 05:57:25.245000 audit[5139]: CRED_ACQ pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.301765 kernel: audit: type=1101 audit(1769493445.243:792): pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.301878 kernel: audit: type=1103 audit(1769493445.245:793): pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.318128 kernel: audit: type=1006 audit(1769493445.245:794): pid=5139 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 27 05:57:25.245000 audit[5139]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdc24b1f00 a2=3 a3=0 items=0 ppid=1 pid=5139 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:25.319141 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 27 05:57:25.348177 kernel: audit: type=1300 audit(1769493445.245:794): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdc24b1f00 a2=3 a3=0 items=0 ppid=1 pid=5139 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:25.348608 kernel: audit: type=1327 audit(1769493445.245:794): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:25.245000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:25.333000 audit[5139]: USER_START pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.394646 kernel: audit: type=1105 audit(1769493445.333:795): pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.395473 kernel: audit: type=1103 audit(1769493445.337:796): pid=5143 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.337000 audit[5143]: CRED_ACQ pid=5143 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.513951 sshd[5143]: Connection closed by 4.153.228.146 port 59348 Jan 27 05:57:25.515153 sshd-session[5139]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:25.516000 audit[5139]: USER_END pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.524925 systemd[1]: sshd@16-10.128.0.23:22-4.153.228.146:59348.service: Deactivated successfully. Jan 27 05:57:25.525268 systemd-logind[1576]: Session 18 logged out. Waiting for processes to exit. Jan 27 05:57:25.529974 systemd[1]: session-18.scope: Deactivated successfully. Jan 27 05:57:25.536532 systemd-logind[1576]: Removed session 18. 
Jan 27 05:57:25.553414 kernel: audit: type=1106 audit(1769493445.516:797): pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.516000 audit[5139]: CRED_DISP pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.522000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.128.0.23:22-4.153.228.146:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:25.579413 kernel: audit: type=1104 audit(1769493445.516:798): pid=5139 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:25.852812 kubelet[2857]: E0127 05:57:25.851537 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:57:26.852541 kubelet[2857]: E0127 05:57:26.852444 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:57:26.853311 kubelet[2857]: E0127 05:57:26.853244 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:57:29.854584 kubelet[2857]: E0127 05:57:29.854466 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:57:30.559049 systemd[1]: Started sshd@17-10.128.0.23:22-4.153.228.146:59352.service - OpenSSH per-connection server daemon (4.153.228.146:59352). Jan 27 05:57:30.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.128.0.23:22-4.153.228.146:59352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:30.566499 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:30.566591 kernel: audit: type=1130 audit(1769493450.558:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.128.0.23:22-4.153.228.146:59352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:30.797000 audit[5182]: USER_ACCT pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.801722 sshd-session[5182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:30.805025 sshd[5182]: Accepted publickey for core from 4.153.228.146 port 59352 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:30.812911 systemd-logind[1576]: New session 19 of user core. Jan 27 05:57:30.831409 kernel: audit: type=1101 audit(1769493450.797:801): pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.831534 kernel: audit: type=1103 audit(1769493450.797:802): pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.797000 audit[5182]: CRED_ACQ pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.871702 kernel: audit: type=1006 audit(1769493450.797:803): pid=5182 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 27 05:57:30.797000 audit[5182]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff5d9d3e10 a2=3 a3=0 items=0 ppid=1 pid=5182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:30.883173 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 27 05:57:30.905704 kernel: audit: type=1300 audit(1769493450.797:803): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff5d9d3e10 a2=3 a3=0 items=0 ppid=1 pid=5182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:30.905790 kernel: audit: type=1327 audit(1769493450.797:803): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:30.797000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:30.887000 audit[5182]: USER_START pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.950415 kernel: audit: type=1105 audit(1769493450.887:804): pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.903000 audit[5186]: CRED_ACQ pid=5186 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:30.977459 kernel: audit: type=1103 audit(1769493450.903:805): pid=5186 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.075681 sshd[5186]: Connection closed by 4.153.228.146 port 59352 Jan 27 05:57:31.077646 sshd-session[5182]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:31.079000 audit[5182]: USER_END pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.117776 kernel: audit: type=1106 audit(1769493451.079:806): pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.079000 audit[5182]: CRED_DISP pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.143670 kernel: audit: type=1104 audit(1769493451.079:807): pid=5182 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh 
res=success' Jan 27 05:57:31.153683 systemd[1]: sshd@17-10.128.0.23:22-4.153.228.146:59352.service: Deactivated successfully. Jan 27 05:57:31.152000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.128.0.23:22-4.153.228.146:59352 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:31.157035 systemd[1]: session-19.scope: Deactivated successfully. Jan 27 05:57:31.158728 systemd-logind[1576]: Session 19 logged out. Waiting for processes to exit. Jan 27 05:57:31.164510 systemd[1]: Started sshd@18-10.128.0.23:22-4.153.228.146:59362.service - OpenSSH per-connection server daemon (4.153.228.146:59362). Jan 27 05:57:31.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.128.0.23:22-4.153.228.146:59362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:31.166415 systemd-logind[1576]: Removed session 19. Jan 27 05:57:31.389000 audit[5198]: USER_ACCT pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.390975 sshd[5198]: Accepted publickey for core from 4.153.228.146 port 59362 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:31.391000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.391000 audit[5198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff1b431890 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:31.391000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:31.393880 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:31.401437 systemd-logind[1576]: New session 20 of user core. Jan 27 05:57:31.406634 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 27 05:57:31.410000 audit[5198]: USER_START pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.412000 audit[5202]: CRED_ACQ pid=5202 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.622749 sshd[5202]: Connection closed by 4.153.228.146 port 59362 Jan 27 05:57:31.623705 sshd-session[5198]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:31.624000 audit[5198]: USER_END pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.624000 audit[5198]: CRED_DISP pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.629842 systemd[1]: sshd@18-10.128.0.23:22-4.153.228.146:59362.service: Deactivated successfully. Jan 27 05:57:31.628000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.128.0.23:22-4.153.228.146:59362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:31.633088 systemd[1]: session-20.scope: Deactivated successfully. Jan 27 05:57:31.636796 systemd-logind[1576]: Session 20 logged out. Waiting for processes to exit. Jan 27 05:57:31.638313 systemd-logind[1576]: Removed session 20. Jan 27 05:57:31.669000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.128.0.23:22-4.153.228.146:59374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:31.670951 systemd[1]: Started sshd@19-10.128.0.23:22-4.153.228.146:59374.service - OpenSSH per-connection server daemon (4.153.228.146:59374). 
Jan 27 05:57:31.852025 kubelet[2857]: E0127 05:57:31.851852 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:57:31.893000 audit[5211]: USER_ACCT pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.894978 sshd[5211]: Accepted publickey for core from 4.153.228.146 port 59374 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:31.894000 audit[5211]: CRED_ACQ pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.894000 audit[5211]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd35438090 a2=3 a3=0 items=0 ppid=1 pid=5211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:31.894000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:31.897472 sshd-session[5211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:31.904265 systemd-logind[1576]: New session 21 of user core. Jan 27 05:57:31.910662 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 27 05:57:31.917000 audit[5211]: USER_START pid=5211 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:31.920000 audit[5215]: CRED_ACQ pid=5215 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:32.671000 audit[5225]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:32.671000 audit[5225]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff81e5b8d0 a2=0 a3=7fff81e5b8bc items=0 ppid=2979 pid=5225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:32.671000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:32.679000 audit[5225]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:32.679000 audit[5225]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff81e5b8d0 a2=0 a3=0 items=0 ppid=2979 pid=5225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:32.679000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:32.711949 sshd[5215]: Connection closed by 4.153.228.146 port 59374 Jan 27 05:57:32.713499 sshd-session[5211]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:32.716000 audit[5227]: NETFILTER_CFG table=filter:146 family=2 entries=38 op=nft_register_rule pid=5227 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:32.716000 audit[5227]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffea3fdb8a0 a2=0 a3=7ffea3fdb88c items=0 ppid=2979 pid=5227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:32.716000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:32.717000 audit[5211]: USER_END pid=5211 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:32.718000 audit[5211]: CRED_DISP pid=5211 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 
terminal=ssh res=success' Jan 27 05:57:32.722000 audit[5227]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5227 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:32.722000 audit[5227]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffea3fdb8a0 a2=0 a3=0 items=0 ppid=2979 pid=5227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:32.724811 systemd[1]: sshd@19-10.128.0.23:22-4.153.228.146:59374.service: Deactivated successfully. Jan 27 05:57:32.722000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:32.724000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.128.0.23:22-4.153.228.146:59374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:32.728150 systemd[1]: session-21.scope: Deactivated successfully. Jan 27 05:57:32.730664 systemd-logind[1576]: Session 21 logged out. Waiting for processes to exit. Jan 27 05:57:32.733223 systemd-logind[1576]: Removed session 21. Jan 27 05:57:32.758885 systemd[1]: Started sshd@20-10.128.0.23:22-4.153.228.146:59384.service - OpenSSH per-connection server daemon (4.153.228.146:59384). Jan 27 05:57:32.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.128.0.23:22-4.153.228.146:59384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:32.996000 audit[5232]: USER_ACCT pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:32.998026 sshd[5232]: Accepted publickey for core from 4.153.228.146 port 59384 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:32.997000 audit[5232]: CRED_ACQ pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:32.997000 audit[5232]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc02447ab0 a2=3 a3=0 items=0 ppid=1 pid=5232 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:32.997000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:33.000280 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:33.011612 systemd-logind[1576]: New session 22 of user core. Jan 27 05:57:33.015653 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 27 05:57:33.019000 audit[5232]: USER_START pid=5232 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.022000 audit[5236]: CRED_ACQ pid=5236 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.352272 sshd[5236]: Connection closed by 4.153.228.146 port 59384 Jan 27 05:57:33.352784 sshd-session[5232]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:33.355000 audit[5232]: USER_END pid=5232 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.355000 audit[5232]: CRED_DISP pid=5232 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.360844 systemd[1]: sshd@20-10.128.0.23:22-4.153.228.146:59384.service: Deactivated successfully. Jan 27 05:57:33.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.128.0.23:22-4.153.228.146:59384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:33.364119 systemd[1]: session-22.scope: Deactivated successfully. Jan 27 05:57:33.365942 systemd-logind[1576]: Session 22 logged out. Waiting for processes to exit. Jan 27 05:57:33.368333 systemd-logind[1576]: Removed session 22. Jan 27 05:57:33.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.128.0.23:22-4.153.228.146:59400 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:33.394587 systemd[1]: Started sshd@21-10.128.0.23:22-4.153.228.146:59400.service - OpenSSH per-connection server daemon (4.153.228.146:59400). 
Jan 27 05:57:33.615000 audit[5247]: USER_ACCT pid=5247 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.617776 sshd[5247]: Accepted publickey for core from 4.153.228.146 port 59400 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:33.617000 audit[5247]: CRED_ACQ pid=5247 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.618000 audit[5247]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff842c2220 a2=3 a3=0 items=0 ppid=1 pid=5247 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:33.618000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:33.620534 sshd-session[5247]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:33.628327 systemd-logind[1576]: New session 23 of user core. Jan 27 05:57:33.638610 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 27 05:57:33.642000 audit[5247]: USER_START pid=5247 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.644000 audit[5251]: CRED_ACQ pid=5251 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.786318 sshd[5251]: Connection closed by 4.153.228.146 port 59400 Jan 27 05:57:33.787687 sshd-session[5247]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:33.788000 audit[5247]: USER_END pid=5247 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.788000 audit[5247]: CRED_DISP pid=5247 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:33.794442 systemd[1]: sshd@21-10.128.0.23:22-4.153.228.146:59400.service: Deactivated successfully. Jan 27 05:57:33.793000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.128.0.23:22-4.153.228.146:59400 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:33.798179 systemd[1]: session-23.scope: Deactivated successfully. Jan 27 05:57:33.800319 systemd-logind[1576]: Session 23 logged out. Waiting for processes to exit. 
Jan 27 05:57:33.802433 systemd-logind[1576]: Removed session 23. Jan 27 05:57:34.854658 kubelet[2857]: E0127 05:57:34.854476 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:57:36.853414 kubelet[2857]: E0127 05:57:36.853183 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:57:37.852079 kubelet[2857]: E0127 05:57:37.851544 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:57:38.830972 systemd[1]: Started sshd@22-10.128.0.23:22-4.153.228.146:45574.service - OpenSSH per-connection server daemon (4.153.228.146:45574). Jan 27 05:57:38.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.128.0.23:22-4.153.228.146:45574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:38.839425 kernel: kauditd_printk_skb: 57 callbacks suppressed Jan 27 05:57:38.839536 kernel: audit: type=1130 audit(1769493458.830:849): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.128.0.23:22-4.153.228.146:45574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:57:39.072000 audit[5263]: USER_ACCT pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.076906 sshd-session[5263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:39.078352 sshd[5263]: Accepted publickey for core from 4.153.228.146 port 45574 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:39.089731 systemd-logind[1576]: New session 24 of user core. Jan 27 05:57:39.104510 kernel: audit: type=1101 audit(1769493459.072:850): pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.104620 kernel: audit: type=1103 audit(1769493459.072:851): pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.072000 audit[5263]: CRED_ACQ pid=5263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.131401 kernel: audit: type=1006 audit(1769493459.072:852): pid=5263 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 27 05:57:39.072000 audit[5263]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8aaeb2a0 a2=3 a3=0 items=0 ppid=1 pid=5263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:39.176628 kernel: audit: type=1300 audit(1769493459.072:852): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8aaeb2a0 a2=3 a3=0 items=0 ppid=1 pid=5263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:39.072000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:39.177787 systemd[1]: Started session-24.scope - Session 24 of User core. 
Jan 27 05:57:39.187420 kernel: audit: type=1327 audit(1769493459.072:852): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:39.187509 kernel: audit: type=1105 audit(1769493459.185:853): pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.185000 audit[5263]: USER_START pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.248405 kernel: audit: type=1103 audit(1769493459.189:854): pid=5267 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.189000 audit[5267]: CRED_ACQ pid=5267 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.345342 sshd[5267]: Connection closed by 4.153.228.146 port 45574 Jan 27 05:57:39.347620 sshd-session[5263]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:39.348000 audit[5263]: USER_END pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.354965 systemd[1]: sshd@22-10.128.0.23:22-4.153.228.146:45574.service: Deactivated successfully. Jan 27 05:57:39.359687 systemd[1]: session-24.scope: Deactivated successfully. Jan 27 05:57:39.362432 systemd-logind[1576]: Session 24 logged out. Waiting for processes to exit. Jan 27 05:57:39.365300 systemd-logind[1576]: Removed session 24. 
Jan 27 05:57:39.391169 kernel: audit: type=1106 audit(1769493459.348:855): pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.391295 kernel: audit: type=1104 audit(1769493459.348:856): pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.348000 audit[5263]: CRED_DISP pid=5263 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:39.354000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.128.0.23:22-4.153.228.146:45574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:39.781000 audit[5279]: NETFILTER_CFG table=filter:148 family=2 entries=26 op=nft_register_rule pid=5279 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:39.781000 audit[5279]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe70e68300 a2=0 a3=7ffe70e682ec items=0 ppid=2979 pid=5279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:39.781000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:39.790000 audit[5279]: NETFILTER_CFG table=nat:149 family=2 entries=104 op=nft_register_chain pid=5279 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 05:57:39.790000 audit[5279]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffe70e68300 a2=0 a3=7ffe70e682ec items=0 ppid=2979 pid=5279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:39.790000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 05:57:40.853604 kubelet[2857]: E0127 05:57:40.852989 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:57:42.854333 kubelet[2857]: E0127 05:57:42.854069 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b" Jan 27 05:57:44.392867 systemd[1]: Started sshd@23-10.128.0.23:22-4.153.228.146:45590.service - OpenSSH per-connection server daemon (4.153.228.146:45590). Jan 27 05:57:44.404214 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 27 05:57:44.404341 kernel: audit: type=1130 audit(1769493464.391:860): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.128.0.23:22-4.153.228.146:45590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:44.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.128.0.23:22-4.153.228.146:45590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:44.636000 audit[5281]: USER_ACCT pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.641118 sshd-session[5281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:44.642602 sshd[5281]: Accepted publickey for core from 4.153.228.146 port 45590 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:44.650840 systemd-logind[1576]: New session 25 of user core. 
Jan 27 05:57:44.638000 audit[5281]: CRED_ACQ pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.695320 kernel: audit: type=1101 audit(1769493464.636:861): pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.695629 kernel: audit: type=1103 audit(1769493464.638:862): pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.695680 kernel: audit: type=1006 audit(1769493464.638:863): pid=5281 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 27 05:57:44.638000 audit[5281]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff39798100 a2=3 a3=0 items=0 ppid=1 pid=5281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:44.711754 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 27 05:57:44.741208 kernel: audit: type=1300 audit(1769493464.638:863): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff39798100 a2=3 a3=0 items=0 ppid=1 pid=5281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:44.741326 kernel: audit: type=1327 audit(1769493464.638:863): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:44.638000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:44.751962 kernel: audit: type=1105 audit(1769493464.724:864): pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.724000 audit[5281]: USER_START pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.790714 kernel: audit: type=1103 audit(1769493464.732:865): pid=5285 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.732000 audit[5285]: CRED_ACQ pid=5285 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh 
res=success' Jan 27 05:57:44.949490 sshd[5285]: Connection closed by 4.153.228.146 port 45590 Jan 27 05:57:44.950631 sshd-session[5281]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:44.951000 audit[5281]: USER_END pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.961247 systemd[1]: sshd@23-10.128.0.23:22-4.153.228.146:45590.service: Deactivated successfully. Jan 27 05:57:44.966247 systemd[1]: session-25.scope: Deactivated successfully. Jan 27 05:57:44.969995 systemd-logind[1576]: Session 25 logged out. Waiting for processes to exit. Jan 27 05:57:44.973153 systemd-logind[1576]: Removed session 25. Jan 27 05:57:44.951000 audit[5281]: CRED_DISP pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:45.015481 kernel: audit: type=1106 audit(1769493464.951:866): pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:45.015598 kernel: audit: type=1104 audit(1769493464.951:867): pid=5281 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:44.957000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.128.0.23:22-4.153.228.146:45590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:46.851479 kubelet[2857]: E0127 05:57:46.850865 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-xgb9b" podUID="e4b3467b-adcb-4738-9feb-ff8bcf1c33fe" Jan 27 05:57:48.019031 update_engine[1580]: I20260127 05:57:48.018951 1580 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 27 05:57:48.019031 update_engine[1580]: I20260127 05:57:48.019014 1580 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 27 05:57:48.019812 update_engine[1580]: I20260127 05:57:48.019279 1580 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 27 05:57:48.020151 update_engine[1580]: I20260127 05:57:48.020098 1580 omaha_request_params.cc:62] Current group set to developer Jan 27 05:57:48.020302 update_engine[1580]: I20260127 05:57:48.020272 1580 update_attempter.cc:499] Already updated boot flags. Skipping. 
Jan 27 05:57:48.020383 update_engine[1580]: I20260127 05:57:48.020298 1580 update_attempter.cc:643] Scheduling an action processor start. Jan 27 05:57:48.020383 update_engine[1580]: I20260127 05:57:48.020325 1580 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 27 05:57:48.020486 update_engine[1580]: I20260127 05:57:48.020418 1580 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 27 05:57:48.020540 update_engine[1580]: I20260127 05:57:48.020516 1580 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 27 05:57:48.020540 update_engine[1580]: I20260127 05:57:48.020530 1580 omaha_request_action.cc:272] Request: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020540 update_engine[1580]: Jan 27 05:57:48.020968 update_engine[1580]: I20260127 05:57:48.020543 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 27 05:57:48.022338 update_engine[1580]: I20260127 05:57:48.022233 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 27 05:57:48.022555 locksmithd[1662]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 27 05:57:48.023300 update_engine[1580]: I20260127 05:57:48.023216 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 27 05:57:48.069043 update_engine[1580]: E20260127 05:57:48.068969 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 27 05:57:48.069176 update_engine[1580]: I20260127 05:57:48.069129 1580 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 27 05:57:48.853783 containerd[1613]: time="2026-01-27T05:57:48.852709467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 05:57:49.010964 containerd[1613]: time="2026-01-27T05:57:49.010886826Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:49.012465 containerd[1613]: time="2026-01-27T05:57:49.012383229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:49.012999 containerd[1613]: time="2026-01-27T05:57:49.012475434Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 05:57:49.013106 kubelet[2857]: E0127 05:57:49.012855 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:57:49.013106 kubelet[2857]: E0127 05:57:49.012917 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 05:57:49.014016 kubelet[2857]: E0127 05:57:49.013163 2857 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:4aaa555957aa46189e6920dc1ced80c6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:49.015788 containerd[1613]: time="2026-01-27T05:57:49.015720409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 05:57:49.171905 containerd[1613]: time="2026-01-27T05:57:49.171734479Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:49.173343 containerd[1613]: time="2026-01-27T05:57:49.173217104Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 05:57:49.173343 containerd[1613]: time="2026-01-27T05:57:49.173262452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:49.173806 kubelet[2857]: E0127 05:57:49.173756 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:57:49.173999 kubelet[2857]: E0127 05:57:49.173825 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 05:57:49.174093 kubelet[2857]: E0127 05:57:49.174004 2857 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdl7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-8b7df678c-rdlpv_calico-system(2da61fce-ec56-409e-b213-304528221d28): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:49.175816 kubelet[2857]: E0127 05:57:49.175762 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-8b7df678c-rdlpv" podUID="2da61fce-ec56-409e-b213-304528221d28" Jan 27 05:57:49.851761 kubelet[2857]: E0127 05:57:49.851704 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4mkcq" podUID="13975170-e4d2-41c8-9e0a-f42e4f517791" Jan 27 05:57:49.998845 systemd[1]: 
Started sshd@24-10.128.0.23:22-4.153.228.146:41794.service - OpenSSH per-connection server daemon (4.153.228.146:41794). Jan 27 05:57:50.007439 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:50.007559 kernel: audit: type=1130 audit(1769493469.997:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.128.0.23:22-4.153.228.146:41794 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:49.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.128.0.23:22-4.153.228.146:41794 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:50.266000 audit[5301]: USER_ACCT pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.298127 sshd[5301]: Accepted publickey for core from 4.153.228.146 port 41794 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:50.298646 kernel: audit: type=1101 audit(1769493470.266:870): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.301225 sshd-session[5301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:50.298000 audit[5301]: CRED_ACQ pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.330397 kernel: audit: type=1103 audit(1769493470.298:871): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.298000 audit[5301]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc52bac6f0 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:50.376885 kernel: audit: type=1006 audit(1769493470.298:872): pid=5301 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 27 05:57:50.376991 kernel: audit: type=1300 audit(1769493470.298:872): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc52bac6f0 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:50.298000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:50.387216 systemd-logind[1576]: New session 26 of user core. 
Jan 27 05:57:50.390571 kernel: audit: type=1327 audit(1769493470.298:872): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:50.394285 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 27 05:57:50.399000 audit[5301]: USER_START pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.437410 kernel: audit: type=1105 audit(1769493470.399:873): pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.442000 audit[5312]: CRED_ACQ pid=5312 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.471438 kernel: audit: type=1103 audit(1769493470.442:874): pid=5312 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.664532 sshd[5312]: Connection closed by 4.153.228.146 port 41794 Jan 27 05:57:50.664281 sshd-session[5301]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:50.667000 audit[5301]: USER_END pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.702000 audit[5301]: CRED_DISP pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.712932 systemd[1]: sshd@24-10.128.0.23:22-4.153.228.146:41794.service: Deactivated successfully. Jan 27 05:57:50.726643 systemd[1]: session-26.scope: Deactivated successfully. 
Jan 27 05:57:50.730186 kernel: audit: type=1106 audit(1769493470.667:875): pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.730993 kernel: audit: type=1104 audit(1769493470.702:876): pid=5301 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:50.714000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.128.0.23:22-4.153.228.146:41794 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:50.735266 systemd-logind[1576]: Session 26 logged out. Waiting for processes to exit. Jan 27 05:57:50.737180 systemd-logind[1576]: Removed session 26. Jan 27 05:57:52.855123 containerd[1613]: time="2026-01-27T05:57:52.854750808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 05:57:53.018637 containerd[1613]: time="2026-01-27T05:57:53.018525223Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:53.021061 containerd[1613]: time="2026-01-27T05:57:53.020904322Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 05:57:53.021061 containerd[1613]: time="2026-01-27T05:57:53.021022665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:53.021708 kubelet[2857]: E0127 05:57:53.021572 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:53.023050 kubelet[2857]: E0127 05:57:53.022434 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 05:57:53.023373 kubelet[2857]: E0127 05:57:53.023264 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwtvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-848cb947c-w2cj4_calico-apiserver(124b40a0-d1a3-4e06-b27d-7331549e3e87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:53.025001 kubelet[2857]: E0127 05:57:53.024924 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-848cb947c-w2cj4" podUID="124b40a0-d1a3-4e06-b27d-7331549e3e87" Jan 27 05:57:55.710000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.128.0.23:22-4.153.228.146:59708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 05:57:55.711791 systemd[1]: Started sshd@25-10.128.0.23:22-4.153.228.146:59708.service - OpenSSH per-connection server daemon (4.153.228.146:59708). Jan 27 05:57:55.717689 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 05:57:55.717798 kernel: audit: type=1130 audit(1769493475.710:878): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.128.0.23:22-4.153.228.146:59708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 05:57:55.855841 containerd[1613]: time="2026-01-27T05:57:55.855482986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 05:57:55.997594 sshd[5326]: Accepted publickey for core from 4.153.228.146 port 59708 ssh2: RSA SHA256:JN4BeFMUlbDxkzeIyaplo/K9bJRzUxsC6g1kswb3p80 Jan 27 05:57:55.995000 audit[5326]: USER_ACCT pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.001195 sshd-session[5326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 05:57:56.022985 containerd[1613]: time="2026-01-27T05:57:56.022937864Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 05:57:56.024902 containerd[1613]: time="2026-01-27T05:57:56.024852485Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 05:57:56.025132 containerd[1613]: time="2026-01-27T05:57:56.025053939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 05:57:56.025547 kubelet[2857]: E0127 05:57:56.025494 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:57:56.027248 kubelet[2857]: E0127 05:57:56.026499 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 05:57:56.027248 kubelet[2857]: E0127 05:57:56.026746 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h46mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-56c68bdb6-v2r5d_calico-system(abcbc1d1-af63-4772-a8eb-6b5783d69e07): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 05:57:56.029387 kernel: audit: type=1101 audit(1769493475.995:879): pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_oslogin_admin,pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.029769 kubelet[2857]: E0127 05:57:56.029569 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-56c68bdb6-v2r5d" podUID="abcbc1d1-af63-4772-a8eb-6b5783d69e07" Jan 27 05:57:55.996000 audit[5326]: CRED_ACQ pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.064829 kernel: audit: type=1103 audit(1769493475.996:880): pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.071705 systemd-logind[1576]: New session 27 of user core. Jan 27 05:57:56.091396 kernel: audit: type=1006 audit(1769493475.996:881): pid=5326 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 27 05:57:56.093227 systemd[1]: Started session-27.scope - Session 27 of User core. Jan 27 05:57:55.996000 audit[5326]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffead5e95f0 a2=3 a3=0 items=0 ppid=1 pid=5326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:56.126391 kernel: audit: type=1300 audit(1769493475.996:881): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffead5e95f0 a2=3 a3=0 items=0 ppid=1 pid=5326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 05:57:55.996000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:56.139207 kernel: audit: type=1327 audit(1769493475.996:881): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 05:57:56.137000 audit[5326]: USER_START pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.176437 kernel: audit: type=1105 audit(1769493476.137:882): pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.176000 audit[5330]: CRED_ACQ pid=5330 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.211465 kernel: audit: type=1103 audit(1769493476.176:883): pid=5330 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.363665 sshd[5330]: Connection closed by 4.153.228.146 port 59708 Jan 27 05:57:56.368725 
sshd-session[5326]: pam_unix(sshd:session): session closed for user core Jan 27 05:57:56.372000 audit[5326]: USER_END pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.379427 systemd-logind[1576]: Session 27 logged out. Waiting for processes to exit. Jan 27 05:57:56.381268 systemd[1]: sshd@25-10.128.0.23:22-4.153.228.146:59708.service: Deactivated successfully. Jan 27 05:57:56.386505 systemd[1]: session-27.scope: Deactivated successfully. Jan 27 05:57:56.392281 systemd-logind[1576]: Removed session 27. Jan 27 05:57:56.372000 audit[5326]: CRED_DISP pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.435456 kernel: audit: type=1106 audit(1769493476.372:884): pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_mkhomedir,pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.435553 kernel: audit: type=1104 audit(1769493476.372:885): pid=5326 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_oslogin_login acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 05:57:56.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.128.0.23:22-4.153.228.146:59708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success'
Jan 27 05:57:57.853686 containerd[1613]: time="2026-01-27T05:57:57.853517019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Jan 27 05:57:58.012196 containerd[1613]: time="2026-01-27T05:57:58.012037849Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 27 05:57:58.013703 containerd[1613]: time="2026-01-27T05:57:58.013541141Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Jan 27 05:57:58.013703 containerd[1613]: time="2026-01-27T05:57:58.013663743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0"
Jan 27 05:57:58.014541 kubelet[2857]: E0127 05:57:58.014161 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 27 05:57:58.014541 kubelet[2857]: E0127 05:57:58.014233 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 27 05:57:58.014541 kubelet[2857]: E0127 05:57:58.014432 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Jan 27 05:57:58.017780 containerd[1613]: time="2026-01-27T05:57:58.017680123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Jan 27 05:57:58.018793 update_engine[1580]: I20260127 05:57:58.018733 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 27 05:57:58.019937 update_engine[1580]: I20260127 05:57:58.019305 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 27 05:57:58.019937 update_engine[1580]: I20260127 05:57:58.019829 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 27 05:57:58.026005 update_engine[1580]: E20260127 05:57:58.025696 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Jan 27 05:57:58.026005 update_engine[1580]: I20260127 05:57:58.025792 1580 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Jan 27 05:57:58.187803 containerd[1613]: time="2026-01-27T05:57:58.186965990Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 27 05:57:58.190025 containerd[1613]: time="2026-01-27T05:57:58.189980061Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Jan 27 05:57:58.191472 containerd[1613]: time="2026-01-27T05:57:58.191413395Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Jan 27 05:57:58.192013 kubelet[2857]: E0127 05:57:58.191909 2857 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 27 05:57:58.192013 kubelet[2857]: E0127 05:57:58.191984 2857 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 27 05:57:58.192799 kubelet[2857]: E0127 05:57:58.192725 2857 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2k2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-s7bgd_calico-system(8f740da6-d731-4b30-bf8e-ada1ccd8b61b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Jan 27 05:57:58.194262 kubelet[2857]: E0127 05:57:58.194201 2857 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-s7bgd" podUID="8f740da6-d731-4b30-bf8e-ada1ccd8b61b"