Jan 14 01:16:19.288234 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Jan 13 22:26:24 -00 2026
Jan 14 01:16:19.288262 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260
Jan 14 01:16:19.288272 kernel: BIOS-provided physical RAM map:
Jan 14 01:16:19.288278 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 14 01:16:19.288286 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Jan 14 01:16:19.288291 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Jan 14 01:16:19.288297 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Jan 14 01:16:19.288302 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 14 01:16:19.288307 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 14 01:16:19.288312 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 14 01:16:19.288317 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Jan 14 01:16:19.288322 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Jan 14 01:16:19.288329 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 14 01:16:19.288334 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 14 01:16:19.288340 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 14 01:16:19.288346 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Jan 14 01:16:19.288351 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Jan 14 01:16:19.288359 kernel: NX (Execute Disable) protection: active
Jan 14 01:16:19.288364 kernel: APIC: Static calls initialized
Jan 14 01:16:19.288369 kernel: e820: update [mem 0x7dfab018-0x7dfb4a57] usable ==> usable
Jan 14 01:16:19.288375 kernel: e820: update [mem 0x7df6f018-0x7dfaa657] usable ==> usable
Jan 14 01:16:19.288380 kernel: e820: update [mem 0x7df33018-0x7df6e657] usable ==> usable
Jan 14 01:16:19.288385 kernel: extended physical RAM map:
Jan 14 01:16:19.288390 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 14 01:16:19.288395 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df33017] usable
Jan 14 01:16:19.288401 kernel: reserve setup_data: [mem 0x000000007df33018-0x000000007df6e657] usable
Jan 14 01:16:19.288406 kernel: reserve setup_data: [mem 0x000000007df6e658-0x000000007df6f017] usable
Jan 14 01:16:19.288413 kernel: reserve setup_data: [mem 0x000000007df6f018-0x000000007dfaa657] usable
Jan 14 01:16:19.288418 kernel: reserve setup_data: [mem 0x000000007dfaa658-0x000000007dfab017] usable
Jan 14 01:16:19.288423 kernel: reserve setup_data: [mem 0x000000007dfab018-0x000000007dfb4a57] usable
Jan 14 01:16:19.288429 kernel: reserve setup_data: [mem 0x000000007dfb4a58-0x000000007ed3efff] usable
Jan 14 01:16:19.288434 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Jan 14 01:16:19.288439 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Jan 14 01:16:19.288444 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 14 01:16:19.288449 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 14 01:16:19.288454 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 14 01:16:19.288459 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Jan 14 01:16:19.288465 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Jan 14 01:16:19.288480 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 14 01:16:19.288486 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 14 01:16:19.288494 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 14 01:16:19.288499 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Jan 14 01:16:19.288507 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Jan 14 01:16:19.288513 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Jan 14 01:16:19.288518 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e01b198 RNG=0x7fb73018
Jan 14 01:16:19.288524 kernel: random: crng init done
Jan 14 01:16:19.288529 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jan 14 01:16:19.288535 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jan 14 01:16:19.288540 kernel: secureboot: Secure boot disabled
Jan 14 01:16:19.288546 kernel: SMBIOS 3.0.0 present.
Jan 14 01:16:19.288551 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Jan 14 01:16:19.288559 kernel: DMI: Memory slots populated: 1/1
Jan 14 01:16:19.288564 kernel: Hypervisor detected: KVM
Jan 14 01:16:19.288570 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Jan 14 01:16:19.288575 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 14 01:16:19.288580 kernel: kvm-clock: using sched offset of 13028526676 cycles
Jan 14 01:16:19.288586 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 14 01:16:19.288592 kernel: tsc: Detected 2399.998 MHz processor
Jan 14 01:16:19.288598 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 14 01:16:19.288604 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 14 01:16:19.288610 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Jan 14 01:16:19.288618 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jan 14 01:16:19.288624 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 14 01:16:19.288629 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Jan 14 01:16:19.288635 kernel: Using GB pages for direct mapping
Jan 14 01:16:19.288641 kernel: ACPI: Early table checksum verification disabled
Jan 14 01:16:19.288646 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Jan 14 01:16:19.288652 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Jan 14 01:16:19.288660 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288666 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288672 kernel: ACPI: FACS 0x000000007FBDD000 000040
Jan 14 01:16:19.288677 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288683 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288689 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288695 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:16:19.288703 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 14 01:16:19.288708 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Jan 14 01:16:19.288714 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Jan 14 01:16:19.288720 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Jan 14 01:16:19.288725 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Jan 14 01:16:19.288731 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Jan 14 01:16:19.288737 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Jan 14 01:16:19.288744 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Jan 14 01:16:19.288750 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Jan 14 01:16:19.288755 kernel: No NUMA configuration found
Jan 14 01:16:19.288761 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Jan 14 01:16:19.288767 kernel: NODE_DATA(0) allocated [mem 0x179ff8dc0-0x179ffffff]
Jan 14 01:16:19.288773 kernel: Zone ranges:
Jan 14 01:16:19.288778 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 14 01:16:19.288784 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 14 01:16:19.288792 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Jan 14 01:16:19.288798 kernel: Device empty
Jan 14 01:16:19.288803 kernel: Movable zone start for each node
Jan 14 01:16:19.288809 kernel: Early memory node ranges
Jan 14 01:16:19.288814 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jan 14 01:16:19.288820 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Jan 14 01:16:19.288826 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Jan 14 01:16:19.288831 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Jan 14 01:16:19.288839 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Jan 14 01:16:19.288845 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Jan 14 01:16:19.288850 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 14 01:16:19.288856 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jan 14 01:16:19.288862 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jan 14 01:16:19.288868 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 14 01:16:19.288873 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Jan 14 01:16:19.288881 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Jan 14 01:16:19.288887 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 14 01:16:19.288893 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 14 01:16:19.288899 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 14 01:16:19.288904 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 14 01:16:19.288910 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 14 01:16:19.288916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 14 01:16:19.288924 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 14 01:16:19.288930 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 14 01:16:19.288935 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 14 01:16:19.288941 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Jan 14 01:16:19.288947 kernel: CPU topo: Max. logical packages: 1
Jan 14 01:16:19.288953 kernel: CPU topo: Max. logical dies: 1
Jan 14 01:16:19.288967 kernel: CPU topo: Max. dies per package: 1
Jan 14 01:16:19.288972 kernel: CPU topo: Max. threads per core: 1
Jan 14 01:16:19.288978 kernel: CPU topo: Num. cores per package: 2
Jan 14 01:16:19.288984 kernel: CPU topo: Num. threads per package: 2
Jan 14 01:16:19.288992 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 14 01:16:19.288998 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 14 01:16:19.289004 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Jan 14 01:16:19.289010 kernel: Booting paravirtualized kernel on KVM
Jan 14 01:16:19.289016 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 14 01:16:19.289025 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 14 01:16:19.289030 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 14 01:16:19.289036 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 14 01:16:19.289042 kernel: pcpu-alloc: [0] 0 1
Jan 14 01:16:19.289048 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 14 01:16:19.289054 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260
Jan 14 01:16:19.289063 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 14 01:16:19.289069 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 14 01:16:19.289075 kernel: Fallback order for Node 0: 0
Jan 14 01:16:19.289081 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Jan 14 01:16:19.289087 kernel: Policy zone: Normal
Jan 14 01:16:19.289105 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 14 01:16:19.289111 kernel: software IO TLB: area num 2.
Jan 14 01:16:19.289119 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 14 01:16:19.289125 kernel: ftrace: allocating 40128 entries in 157 pages
Jan 14 01:16:19.289131 kernel: ftrace: allocated 157 pages with 5 groups
Jan 14 01:16:19.289137 kernel: Dynamic Preempt: voluntary
Jan 14 01:16:19.289143 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 14 01:16:19.289150 kernel: rcu: RCU event tracing is enabled.
Jan 14 01:16:19.289156 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 14 01:16:19.289163 kernel: Trampoline variant of Tasks RCU enabled.
Jan 14 01:16:19.289171 kernel: Rude variant of Tasks RCU enabled.
Jan 14 01:16:19.289177 kernel: Tracing variant of Tasks RCU enabled.
Jan 14 01:16:19.289183 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 14 01:16:19.289189 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 14 01:16:19.289194 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:16:19.289201 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:16:19.289207 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 01:16:19.289215 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 14 01:16:19.289221 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 14 01:16:19.289227 kernel: Console: colour dummy device 80x25
Jan 14 01:16:19.289233 kernel: printk: legacy console [tty0] enabled
Jan 14 01:16:19.289239 kernel: printk: legacy console [ttyS0] enabled
Jan 14 01:16:19.289245 kernel: ACPI: Core revision 20240827
Jan 14 01:16:19.289251 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Jan 14 01:16:19.289259 kernel: APIC: Switch to symmetric I/O mode setup
Jan 14 01:16:19.289265 kernel: x2apic enabled
Jan 14 01:16:19.289271 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 14 01:16:19.289277 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 14 01:16:19.289283 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns
Jan 14 01:16:19.289290 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Jan 14 01:16:19.289296 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 14 01:16:19.289304 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 14 01:16:19.289309 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 14 01:16:19.289315 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 14 01:16:19.289321 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 14 01:16:19.289327 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 14 01:16:19.289333 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 14 01:16:19.289339 kernel: active return thunk: srso_alias_return_thunk
Jan 14 01:16:19.289347 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Jan 14 01:16:19.289353 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Jan 14 01:16:19.289359 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 14 01:16:19.289365 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 14 01:16:19.289371 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 14 01:16:19.289377 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 14 01:16:19.289383 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Jan 14 01:16:19.289391 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Jan 14 01:16:19.289397 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Jan 14 01:16:19.289403 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 14 01:16:19.289409 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 14 01:16:19.289415 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Jan 14 01:16:19.289421 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Jan 14 01:16:19.289427 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Jan 14 01:16:19.289435 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Jan 14 01:16:19.289441 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Jan 14 01:16:19.289447 kernel: Freeing SMP alternatives memory: 32K
Jan 14 01:16:19.289453 kernel: pid_max: default: 32768 minimum: 301
Jan 14 01:16:19.289459 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 14 01:16:19.289465 kernel: landlock: Up and running.
Jan 14 01:16:19.289480 kernel: SELinux: Initializing.
Jan 14 01:16:19.289489 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 01:16:19.289495 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 01:16:19.289501 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Jan 14 01:16:19.289506 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 14 01:16:19.289512 kernel: ... version:                0
Jan 14 01:16:19.289518 kernel: ... bit width:              48
Jan 14 01:16:19.289524 kernel: ... generic registers:      6
Jan 14 01:16:19.289530 kernel: ... value mask:             0000ffffffffffff
Jan 14 01:16:19.289539 kernel: ... max period:             00007fffffffffff
Jan 14 01:16:19.289545 kernel: ... fixed-purpose events:   0
Jan 14 01:16:19.289551 kernel: ... event mask:             000000000000003f
Jan 14 01:16:19.289557 kernel: signal: max sigframe size: 3376
Jan 14 01:16:19.289563 kernel: rcu: Hierarchical SRCU implementation.
Jan 14 01:16:19.289569 kernel: rcu: Max phase no-delay instances is 400.
Jan 14 01:16:19.289575 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 14 01:16:19.289584 kernel: smp: Bringing up secondary CPUs ...
Jan 14 01:16:19.289589 kernel: smpboot: x86: Booting SMP configuration:
Jan 14 01:16:19.289595 kernel: .... node #0, CPUs: #1
Jan 14 01:16:19.289602 kernel: smp: Brought up 1 node, 2 CPUs
Jan 14 01:16:19.289608 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Jan 14 01:16:19.289614 kernel: Memory: 3873096K/4091168K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 212440K reserved, 0K cma-reserved)
Jan 14 01:16:19.289620 kernel: devtmpfs: initialized
Jan 14 01:16:19.289628 kernel: x86/mm: Memory block size: 128MB
Jan 14 01:16:19.289634 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Jan 14 01:16:19.289640 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 14 01:16:19.289646 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 14 01:16:19.289652 kernel: pinctrl core: initialized pinctrl subsystem
Jan 14 01:16:19.289658 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 14 01:16:19.289667 kernel: audit: initializing netlink subsys (disabled)
Jan 14 01:16:19.289679 kernel: audit: type=2000 audit(1768353376.953:1): state=initialized audit_enabled=0 res=1
Jan 14 01:16:19.289687 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 14 01:16:19.289693 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 14 01:16:19.289699 kernel: cpuidle: using governor menu
Jan 14 01:16:19.289705 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 14 01:16:19.289711 kernel: dca service started, version 1.12.1
Jan 14 01:16:19.289717 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jan 14 01:16:19.289725 kernel: PCI: Using configuration type 1 for base access
Jan 14 01:16:19.289732 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 14 01:16:19.289738 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 14 01:16:19.289744 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 14 01:16:19.289750 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 14 01:16:19.289756 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 14 01:16:19.289762 kernel: ACPI: Added _OSI(Module Device)
Jan 14 01:16:19.289770 kernel: ACPI: Added _OSI(Processor Device)
Jan 14 01:16:19.289776 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 14 01:16:19.289782 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 14 01:16:19.289788 kernel: ACPI: Interpreter enabled
Jan 14 01:16:19.289794 kernel: ACPI: PM: (supports S0 S5)
Jan 14 01:16:19.289800 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 14 01:16:19.289806 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 14 01:16:19.289814 kernel: PCI: Using E820 reservations for host bridge windows
Jan 14 01:16:19.289820 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 14 01:16:19.289826 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 14 01:16:19.290046 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 14 01:16:19.290219 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jan 14 01:16:19.290423 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jan 14 01:16:19.290436 kernel: PCI host bridge to bus 0000:00
Jan 14 01:16:19.290594 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 14 01:16:19.290728 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 14 01:16:19.290860 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 14 01:16:19.290990 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Jan 14 01:16:19.291138 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jan 14 01:16:19.291274 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Jan 14 01:16:19.291406 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 14 01:16:19.291575 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 14 01:16:19.291730 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 14 01:16:19.291874 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Jan 14 01:16:19.292021 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Jan 14 01:16:19.292185 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Jan 14 01:16:19.292329 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jan 14 01:16:19.293314 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 14 01:16:19.293488 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.298996 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Jan 14 01:16:19.299284 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jan 14 01:16:19.299492 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Jan 14 01:16:19.299664 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Jan 14 01:16:19.299847 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.300017 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Jan 14 01:16:19.300217 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jan 14 01:16:19.300374 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Jan 14 01:16:19.300559 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.301774 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Jan 14 01:16:19.301940 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jan 14 01:16:19.302088 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Jan 14 01:16:19.302256 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Jan 14 01:16:19.302409 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.302566 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Jan 14 01:16:19.302720 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jan 14 01:16:19.302866 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Jan 14 01:16:19.303022 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.303183 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Jan 14 01:16:19.303328 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jan 14 01:16:19.303481 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Jan 14 01:16:19.303625 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Jan 14 01:16:19.303789 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.303936 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Jan 14 01:16:19.304083 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jan 14 01:16:19.308201 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Jan 14 01:16:19.308370 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Jan 14 01:16:19.308543 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.308695 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Jan 14 01:16:19.308839 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jan 14 01:16:19.308987 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Jan 14 01:16:19.309495 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Jan 14 01:16:19.309656 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.309815 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Jan 14 01:16:19.309958 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jan 14 01:16:19.312568 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Jan 14 01:16:19.312723 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Jan 14 01:16:19.312884 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:16:19.314820 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Jan 14 01:16:19.314973 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jan 14 01:16:19.315131 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Jan 14 01:16:19.315279 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Jan 14 01:16:19.315427 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jan 14 01:16:19.315581 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 14 01:16:19.315730 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jan 14 01:16:19.315873 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Jan 14 01:16:19.316040 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Jan 14 01:16:19.318277 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jan 14 01:16:19.318455 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Jan 14 01:16:19.318718 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 14 01:16:19.318871 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Jan 14 01:16:19.319019 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Jan 14 01:16:19.319188 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 14 01:16:19.319332 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jan 14 01:16:19.319934 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 14 01:16:19.320084 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Jan 14 01:16:19.320250 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jan 14 01:16:19.320403 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jan 14 01:16:19.320567 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Jan 14 01:16:19.320713 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Jan 14 01:16:19.320855 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jan 14 01:16:19.321030 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 14 01:16:19.321202 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Jan 14 01:16:19.321353 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jan 14 01:16:19.321520 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 14 01:16:19.321672 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Jan 14 01:16:19.321817 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Jan 14 01:16:19.321971 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jan 14 01:16:19.322698 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jan 14 01:16:19.322864 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Jan 14 01:16:19.323014 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Jan 14 01:16:19.323172 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jan 14 01:16:19.323182 kernel: acpiphp: Slot [0] registered
Jan 14 01:16:19.323335 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 14 01:16:19.323489 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Jan 14 01:16:19.323639 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Jan 14 01:16:19.323783 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 14 01:16:19.323926 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jan 14 01:16:19.323935 kernel: acpiphp: Slot [0-2] registered
Jan 14 01:16:19.324077 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jan 14 01:16:19.324085 kernel: acpiphp: Slot [0-3] registered
Jan 14 01:16:19.324251 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jan 14 01:16:19.324262 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 14 01:16:19.324281 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 14 01:16:19.324289 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 14 01:16:19.324296 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 14 01:16:19.324302 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 14 01:16:19.324309 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 14 01:16:19.324318 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 14 01:16:19.324324 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 14 01:16:19.324331 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 14 01:16:19.324356 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 14 01:16:19.324363 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 14 01:16:19.324369 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 14 01:16:19.324375 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 14 01:16:19.324384 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 14 01:16:19.324391 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 14 01:16:19.324399 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 14 01:16:19.324406 kernel: iommu: Default domain type: Translated
Jan 14 01:16:19.324414 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 14 01:16:19.324421 kernel: efivars: Registered efivars operations
Jan 14 01:16:19.324427 kernel: PCI: Using ACPI for IRQ routing
Jan 14 01:16:19.324434 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 14 01:16:19.324441 kernel: e820: reserve RAM buffer [mem 0x7df33018-0x7fffffff]
Jan 14 01:16:19.324447 kernel: e820: reserve RAM buffer [mem 0x7df6f018-0x7fffffff]
Jan 14 01:16:19.324453 kernel: e820: reserve RAM buffer [mem 0x7dfab018-0x7fffffff]
Jan 14 01:16:19.324461 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Jan 14 01:16:19.324477 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Jan 14 01:16:19.324483 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Jan 14 01:16:19.324490 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Jan 14 01:16:19.324637 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 14 01:16:19.324777 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 14 01:16:19.324933 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 14 01:16:19.324944 kernel: vgaarb: loaded
Jan 14 01:16:19.324950 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Jan 14 01:16:19.324957 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Jan 14 01:16:19.324963 kernel: clocksource: Switched to clocksource kvm-clock
Jan 14 01:16:19.324970 kernel: VFS: Disk quotas dquot_6.6.0
Jan 14 01:16:19.324976 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 14 01:16:19.324983 kernel: pnp: PnP ACPI init
Jan 14 01:16:19.325151 kernel: 
system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 14 01:16:19.325160 kernel: pnp: PnP ACPI: found 5 devices Jan 14 01:16:19.325167 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 14 01:16:19.325173 kernel: NET: Registered PF_INET protocol family Jan 14 01:16:19.325180 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 14 01:16:19.325187 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 14 01:16:19.325193 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 14 01:16:19.325202 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 14 01:16:19.325211 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 14 01:16:19.325218 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 14 01:16:19.325224 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 01:16:19.325230 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 01:16:19.325237 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 14 01:16:19.325243 kernel: NET: Registered PF_XDP protocol family Jan 14 01:16:19.325393 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 14 01:16:19.325553 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 14 01:16:19.325695 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 14 01:16:19.325835 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 14 01:16:19.325981 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 14 01:16:19.326154 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Jan 14 01:16:19.326341 kernel: pci 0000:00:02.7: bridge 
window [io 0x2000-0x2fff]: assigned Jan 14 01:16:19.326523 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Jan 14 01:16:19.326775 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Jan 14 01:16:19.326919 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 14 01:16:19.327060 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 14 01:16:19.329178 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 14 01:16:19.329343 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 14 01:16:19.329505 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 14 01:16:19.329651 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 14 01:16:19.329792 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 14 01:16:19.329932 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 14 01:16:19.330075 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 14 01:16:19.331294 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 14 01:16:19.331447 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 14 01:16:19.331609 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 14 01:16:19.331752 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 14 01:16:19.331899 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 14 01:16:19.332041 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 14 01:16:19.332198 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 14 01:16:19.332348 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Jan 14 01:16:19.332502 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 14 01:16:19.332649 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Jan 14 01:16:19.332794 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 14 
01:16:19.332995 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 14 01:16:19.337240 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 14 01:16:19.337424 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Jan 14 01:16:19.337588 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 14 01:16:19.337739 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 14 01:16:19.337903 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 14 01:16:19.338050 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Jan 14 01:16:19.338207 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 14 01:16:19.338353 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 14 01:16:19.338508 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 14 01:16:19.338665 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 14 01:16:19.338801 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 14 01:16:19.338935 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 14 01:16:19.339069 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 14 01:16:19.339853 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Jan 14 01:16:19.340008 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Jan 14 01:16:19.340169 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 14 01:16:19.340314 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Jan 14 01:16:19.340461 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Jan 14 01:16:19.340609 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 14 01:16:19.340756 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 14 01:16:19.340904 kernel: pci_bus 0000:05: resource 1 [mem 
0x80f00000-0x80ffffff] Jan 14 01:16:19.341042 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 14 01:16:19.341201 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Jan 14 01:16:19.341382 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 14 01:16:19.341540 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jan 14 01:16:19.341681 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Jan 14 01:16:19.341817 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 14 01:16:19.341984 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jan 14 01:16:19.342164 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Jan 14 01:16:19.342305 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 14 01:16:19.342449 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jan 14 01:16:19.342601 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Jan 14 01:16:19.342737 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 14 01:16:19.342746 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 14 01:16:19.342753 kernel: PCI: CLS 0 bytes, default 64 Jan 14 01:16:19.342759 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 14 01:16:19.342766 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Jan 14 01:16:19.342776 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 14 01:16:19.342782 kernel: Initialise system trusted keyrings Jan 14 01:16:19.342789 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 14 01:16:19.342795 kernel: Key type asymmetric registered Jan 14 01:16:19.342802 kernel: Asymmetric key parser 'x509' registered Jan 14 01:16:19.342808 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 14 01:16:19.342815 kernel: io scheduler 
mq-deadline registered Jan 14 01:16:19.342823 kernel: io scheduler kyber registered Jan 14 01:16:19.342830 kernel: io scheduler bfq registered Jan 14 01:16:19.342977 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 14 01:16:19.343132 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 14 01:16:19.343278 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 14 01:16:19.343418 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 14 01:16:19.343571 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 14 01:16:19.343718 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 14 01:16:19.343860 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 14 01:16:19.344001 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 14 01:16:19.344154 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 14 01:16:19.344296 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 14 01:16:19.344437 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 14 01:16:19.344591 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 14 01:16:19.344734 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 14 01:16:19.344874 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 14 01:16:19.345017 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 14 01:16:19.345168 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 14 01:16:19.345177 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 14 01:16:19.345322 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 14 01:16:19.345464 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 14 01:16:19.345482 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 14 01:16:19.345489 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 14 01:16:19.345496 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 14 01:16:19.345502 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 14 
01:16:19.345512 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 14 01:16:19.345519 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 14 01:16:19.345525 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 14 01:16:19.345675 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 14 01:16:19.345684 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 14 01:16:19.345819 kernel: rtc_cmos 00:03: registered as rtc0 Jan 14 01:16:19.345958 kernel: rtc_cmos 00:03: setting system clock to 2026-01-14T01:16:17 UTC (1768353377) Jan 14 01:16:19.346103 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jan 14 01:16:19.346111 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Jan 14 01:16:19.346119 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jan 14 01:16:19.346125 kernel: efifb: probing for efifb Jan 14 01:16:19.346132 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 14 01:16:19.346138 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 14 01:16:19.346147 kernel: efifb: scrolling: redraw Jan 14 01:16:19.346154 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 14 01:16:19.346160 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 01:16:19.346167 kernel: fb0: EFI VGA frame buffer device Jan 14 01:16:19.346174 kernel: pstore: Using crash dump compression: deflate Jan 14 01:16:19.346184 kernel: pstore: Registered efi_pstore as persistent store backend Jan 14 01:16:19.346194 kernel: NET: Registered PF_INET6 protocol family Jan 14 01:16:19.346206 kernel: Segment Routing with IPv6 Jan 14 01:16:19.346219 kernel: In-situ OAM (IOAM) with IPv6 Jan 14 01:16:19.346228 kernel: NET: Registered PF_PACKET protocol family Jan 14 01:16:19.346238 kernel: Key type dns_resolver registered Jan 14 01:16:19.346245 
kernel: IPI shorthand broadcast: enabled Jan 14 01:16:19.346252 kernel: sched_clock: Marking stable (1971011827, 237644061)->(2247589236, -38933348) Jan 14 01:16:19.346258 kernel: registered taskstats version 1 Jan 14 01:16:19.346267 kernel: Loading compiled-in X.509 certificates Jan 14 01:16:19.346273 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: e43fcdb17feb86efe6ca4b76910b93467fb95f4f' Jan 14 01:16:19.346280 kernel: Demotion targets for Node 0: null Jan 14 01:16:19.346286 kernel: Key type .fscrypt registered Jan 14 01:16:19.346293 kernel: Key type fscrypt-provisioning registered Jan 14 01:16:19.346299 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 14 01:16:19.346306 kernel: ima: Allocated hash algorithm: sha1 Jan 14 01:16:19.346314 kernel: ima: No architecture policies found Jan 14 01:16:19.346321 kernel: clk: Disabling unused clocks Jan 14 01:16:19.346327 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 14 01:16:19.346333 kernel: Write protecting the kernel read-only data: 47104k Jan 14 01:16:19.346340 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 14 01:16:19.346346 kernel: Run /init as init process Jan 14 01:16:19.346353 kernel: with arguments: Jan 14 01:16:19.346362 kernel: /init Jan 14 01:16:19.346368 kernel: with environment: Jan 14 01:16:19.346374 kernel: HOME=/ Jan 14 01:16:19.346380 kernel: TERM=linux Jan 14 01:16:19.346387 kernel: ACPI: bus type USB registered Jan 14 01:16:19.346393 kernel: usbcore: registered new interface driver usbfs Jan 14 01:16:19.346399 kernel: usbcore: registered new interface driver hub Jan 14 01:16:19.346406 kernel: usbcore: registered new device driver usb Jan 14 01:16:19.346579 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 01:16:19.346727 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 14 01:16:19.346885 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 
0x0000000000000010 Jan 14 01:16:19.347041 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 01:16:19.347230 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 14 01:16:19.347376 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 14 01:16:19.347570 kernel: hub 1-0:1.0: USB hub found Jan 14 01:16:19.347728 kernel: hub 1-0:1.0: 4 ports detected Jan 14 01:16:19.347898 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 14 01:16:19.348064 kernel: hub 2-0:1.0: USB hub found Jan 14 01:16:19.348258 kernel: hub 2-0:1.0: 4 ports detected Jan 14 01:16:19.348271 kernel: SCSI subsystem initialized Jan 14 01:16:19.348278 kernel: libata version 3.00 loaded. Jan 14 01:16:19.348423 kernel: ahci 0000:00:1f.2: version 3.0 Jan 14 01:16:19.348432 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 14 01:16:19.348636 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 14 01:16:19.348817 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 14 01:16:19.348986 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 14 01:16:19.349238 kernel: scsi host0: ahci Jan 14 01:16:19.349431 kernel: scsi host1: ahci Jan 14 01:16:19.349635 kernel: scsi host2: ahci Jan 14 01:16:19.349824 kernel: scsi host3: ahci Jan 14 01:16:19.349999 kernel: scsi host4: ahci Jan 14 01:16:19.350191 kernel: scsi host5: ahci Jan 14 01:16:19.350202 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 38 lpm-pol 1 Jan 14 01:16:19.350209 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 38 lpm-pol 1 Jan 14 01:16:19.350215 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 38 lpm-pol 1 Jan 14 01:16:19.350222 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 38 lpm-pol 1 Jan 14 01:16:19.350228 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 38 lpm-pol 1 Jan 14 
01:16:19.350238 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 38 lpm-pol 1 Jan 14 01:16:19.350410 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 14 01:16:19.350420 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 14 01:16:19.350427 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 14 01:16:19.350433 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jan 14 01:16:19.350440 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 14 01:16:19.350448 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 14 01:16:19.350454 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 14 01:16:19.350461 kernel: ata1.00: LPM support broken, forcing max_power Jan 14 01:16:19.350480 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jan 14 01:16:19.350486 kernel: ata1.00: applying bridge limits Jan 14 01:16:19.350493 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 14 01:16:19.350499 kernel: ata1.00: LPM support broken, forcing max_power Jan 14 01:16:19.350506 kernel: ata1.00: configured for UDMA/100 Jan 14 01:16:19.350678 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 14 01:16:19.350687 kernel: usbcore: registered new interface driver usbhid Jan 14 01:16:19.350694 kernel: usbhid: USB HID core driver Jan 14 01:16:19.350852 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 14 01:16:19.351027 kernel: scsi host6: Virtio SCSI HBA Jan 14 01:16:19.351219 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 14 01:16:19.351383 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jan 14 01:16:19.351391 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 14 01:16:19.351554 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 14 01:16:19.351563 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 14 01:16:19.351744 kernel: 
hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 14 01:16:19.351910 kernel: sd 6:0:0:0: Power-on or device reset occurred Jan 14 01:16:19.352070 kernel: sd 6:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Jan 14 01:16:19.352265 kernel: sd 6:0:0:0: [sda] Write Protect is off Jan 14 01:16:19.352428 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08 Jan 14 01:16:19.352618 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 14 01:16:19.352627 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 14 01:16:19.352637 kernel: GPT:25804799 != 160006143 Jan 14 01:16:19.352644 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 14 01:16:19.352650 kernel: GPT:25804799 != 160006143 Jan 14 01:16:19.352657 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 14 01:16:19.352663 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 14 01:16:19.352822 kernel: sd 6:0:0:0: [sda] Attached SCSI disk Jan 14 01:16:19.352830 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 14 01:16:19.352839 kernel: device-mapper: uevent: version 1.0.3 Jan 14 01:16:19.352845 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 14 01:16:19.352852 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 14 01:16:19.352858 kernel: raid6: avx512x4 gen() 40437 MB/s Jan 14 01:16:19.352865 kernel: raid6: avx512x2 gen() 40993 MB/s Jan 14 01:16:19.352871 kernel: raid6: avx512x1 gen() 41865 MB/s Jan 14 01:16:19.352877 kernel: raid6: avx2x4 gen() 43916 MB/s Jan 14 01:16:19.352886 kernel: raid6: avx2x2 gen() 46753 MB/s Jan 14 01:16:19.352892 kernel: raid6: avx2x1 gen() 38293 MB/s Jan 14 01:16:19.352898 kernel: raid6: using algorithm avx2x2 gen() 46753 MB/s Jan 14 01:16:19.352905 kernel: raid6: .... xor() 34184 MB/s, rmw enabled Jan 14 01:16:19.352911 kernel: raid6: using avx512x2 recovery algorithm Jan 14 01:16:19.352918 kernel: xor: automatically using best checksumming function avx Jan 14 01:16:19.352924 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 14 01:16:19.352933 kernel: BTRFS: device fsid cd6116b6-e1b6-44f4-b1e2-5e7c5565b295 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (183) Jan 14 01:16:19.352939 kernel: BTRFS info (device dm-0): first mount of filesystem cd6116b6-e1b6-44f4-b1e2-5e7c5565b295 Jan 14 01:16:19.352946 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:16:19.352952 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 14 01:16:19.352958 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 14 01:16:19.352965 kernel: BTRFS info (device dm-0): enabling free space tree Jan 14 01:16:19.352971 kernel: loop: module loaded Jan 14 01:16:19.352980 kernel: loop0: detected capacity change from 0 to 100544 Jan 14 01:16:19.352986 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 14 01:16:19.352994 systemd[1]: Successfully made /usr/ read-only. 
Jan 14 01:16:19.353002 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:16:19.353010 systemd[1]: Detected virtualization kvm. Jan 14 01:16:19.353017 systemd[1]: Detected architecture x86-64. Jan 14 01:16:19.353025 systemd[1]: Running in initrd. Jan 14 01:16:19.353032 systemd[1]: No hostname configured, using default hostname. Jan 14 01:16:19.353039 systemd[1]: Hostname set to . Jan 14 01:16:19.353046 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:16:19.353053 systemd[1]: Queued start job for default target initrd.target. Jan 14 01:16:19.353060 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:16:19.353068 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:16:19.353081 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:16:19.353107 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 14 01:16:19.353116 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:16:19.353127 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 14 01:16:19.353137 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 14 01:16:19.353150 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:16:19.353160 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 14 01:16:19.353169 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:16:19.353179 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:16:19.353189 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:16:19.353196 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:16:19.353203 systemd[1]: Reached target timers.target - Timer Units. Jan 14 01:16:19.353217 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 01:16:19.353227 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:16:19.353234 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:16:19.353241 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 14 01:16:19.353252 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 14 01:16:19.353262 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:16:19.353272 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:16:19.353282 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:16:19.353289 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:16:19.353296 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 14 01:16:19.353303 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 14 01:16:19.353309 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 01:16:19.353316 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 14 01:16:19.353326 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). 
Jan 14 01:16:19.353333 systemd[1]: Starting systemd-fsck-usr.service... Jan 14 01:16:19.353339 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:16:19.353346 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 01:16:19.353353 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:16:19.353362 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 14 01:16:19.353369 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:16:19.353405 systemd-journald[320]: Collecting audit messages is enabled. Jan 14 01:16:19.353425 systemd[1]: Finished systemd-fsck-usr.service. Jan 14 01:16:19.353432 kernel: audit: type=1130 audit(1768353379.293:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.353439 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 01:16:19.353446 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:16:19.353453 systemd-journald[320]: Journal started Jan 14 01:16:19.353479 systemd-journald[320]: Runtime Journal (/run/log/journal/a534da6f2c294c32971dde88655a083a) is 8M, max 76M, 68M free. Jan 14 01:16:19.293000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:19.362617 kernel: audit: type=1130 audit(1768353379.354:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.362659 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:16:19.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.371118 kernel: audit: type=1130 audit(1768353379.363:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.371153 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 14 01:16:19.369226 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:19.383632 kernel: audit: type=1130 audit(1768353379.369:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.372645 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 14 01:16:19.386125 kernel: Bridge firewalling registered Jan 14 01:16:19.387626 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 14 01:16:19.388281 systemd-modules-load[321]: Inserted module 'br_netfilter' Jan 14 01:16:19.392222 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:16:19.393981 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 01:16:19.403539 kernel: audit: type=1130 audit(1768353379.394:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.394000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.403269 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:16:19.409952 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:16:19.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.417776 kernel: audit: type=1130 audit(1768353379.411:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.416692 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:16:19.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.419748 systemd-tmpfiles[338]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Jan 14 01:16:19.425202 kernel: audit: type=1130 audit(1768353379.417:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.424114 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 14 01:16:19.432301 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:16:19.439574 kernel: audit: type=1130 audit(1768353379.432:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.432000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.438197 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:16:19.440000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.447643 kernel: audit: type=1130 audit(1768353379.440:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.447204 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 14 01:16:19.441000 audit: BPF prog-id=6 op=LOAD Jan 14 01:16:19.452078 dracut-cmdline[354]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ef461ed71f713584f576c99df12ffb04dd99b33cd2d16edeb307d0cf2f5b4260 Jan 14 01:16:19.500536 systemd-resolved[361]: Positive Trust Anchors: Jan 14 01:16:19.500551 systemd-resolved[361]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:16:19.500555 systemd-resolved[361]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:16:19.500577 systemd-resolved[361]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:16:19.522851 systemd-resolved[361]: Defaulting to hostname 'linux'. Jan 14 01:16:19.523825 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:16:19.524000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.524348 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Jan 14 01:16:19.551125 kernel: Loading iSCSI transport class v2.0-870. Jan 14 01:16:19.565124 kernel: iscsi: registered transport (tcp) Jan 14 01:16:19.584786 kernel: iscsi: registered transport (qla4xxx) Jan 14 01:16:19.584877 kernel: QLogic iSCSI HBA Driver Jan 14 01:16:19.620571 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:16:19.646330 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:16:19.648000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.651753 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 01:16:19.741222 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 14 01:16:19.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.745745 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 14 01:16:19.750236 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 14 01:16:19.787143 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:16:19.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.790000 audit: BPF prog-id=7 op=LOAD Jan 14 01:16:19.790000 audit: BPF prog-id=8 op=LOAD Jan 14 01:16:19.792411 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:16:19.840750 systemd-udevd[593]: Using default interface naming scheme 'v257'. 
Jan 14 01:16:19.861702 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:16:19.862000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.865799 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:16:19.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.869264 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 14 01:16:19.870000 audit: BPF prog-id=9 op=LOAD Jan 14 01:16:19.874240 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 01:16:19.912827 dracut-pre-trigger[694]: rd.md=0: removing MD RAID activation Jan 14 01:16:19.919659 systemd-networkd[698]: lo: Link UP Jan 14 01:16:19.921000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.920231 systemd-networkd[698]: lo: Gained carrier Jan 14 01:16:19.921173 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 01:16:19.922602 systemd[1]: Reached target network.target - Network. Jan 14 01:16:19.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:19.952139 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:16:19.954226 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... 
Jan 14 01:16:20.103932 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:16:20.105000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.108363 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 14 01:16:20.271868 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 14 01:16:20.285399 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 14 01:16:20.292116 kernel: cryptd: max_cpu_qlen set to 1000 Jan 14 01:16:20.298684 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 14 01:16:20.310436 systemd-networkd[698]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:20.310447 systemd-networkd[698]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 14 01:16:20.311076 systemd-networkd[698]: eth0: Link UP Jan 14 01:16:20.311327 systemd-networkd[698]: eth0: Gained carrier Jan 14 01:16:20.311337 systemd-networkd[698]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:20.322171 systemd-networkd[698]: eth0: DHCPv4 address 77.42.79.167/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 14 01:16:20.327418 kernel: AES CTR mode by8 optimization enabled Jan 14 01:16:20.331327 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 14 01:16:20.342212 systemd-networkd[698]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:20.342222 systemd-networkd[698]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:16:20.342559 systemd-networkd[698]: eth1: Link UP Jan 14 01:16:20.343058 systemd-networkd[698]: eth1: Gained carrier Jan 14 01:16:20.343066 systemd-networkd[698]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:20.344351 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 14 01:16:20.357317 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 14 01:16:20.361300 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:16:20.364255 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:20.366000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.367177 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 14 01:16:20.384964 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:16:20.389307 systemd-networkd[698]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 14 01:16:20.391181 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:16:20.391721 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:20.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.397210 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:16:20.400111 disk-uuid[869]: Primary Header is updated. Jan 14 01:16:20.400111 disk-uuid[869]: Secondary Entries is updated. Jan 14 01:16:20.400111 disk-uuid[869]: Secondary Header is updated. Jan 14 01:16:20.407254 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 14 01:16:20.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.412508 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:16:20.412904 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:16:20.414312 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:16:20.429607 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 14 01:16:20.435043 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 14 01:16:20.436000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:20.465840 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 14 01:16:20.466000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.458597 disk-uuid[874]: Warning: The kernel is still using the old partition table. Jan 14 01:16:21.458597 disk-uuid[874]: The new table will be used at the next reboot or after you Jan 14 01:16:21.458597 disk-uuid[874]: run partprobe(8) or kpartx(8) Jan 14 01:16:21.458597 disk-uuid[874]: The operation has completed successfully. Jan 14 01:16:21.473149 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 14 01:16:21.502728 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 14 01:16:21.502772 kernel: audit: type=1130 audit(1768353381.474:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.502798 kernel: audit: type=1131 audit(1768353381.474:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:21.473409 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 14 01:16:21.477339 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 14 01:16:21.555162 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (901) Jan 14 01:16:21.561864 kernel: BTRFS info (device sda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:16:21.561959 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:16:21.575373 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 01:16:21.575437 kernel: BTRFS info (device sda6): turning on async discard Jan 14 01:16:21.579336 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 01:16:21.599177 kernel: BTRFS info (device sda6): last unmount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:16:21.600689 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 14 01:16:21.615065 kernel: audit: type=1130 audit(1768353381.601:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.601000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.606433 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 14 01:16:21.797963 ignition[920]: Ignition 2.24.0 Jan 14 01:16:21.797982 ignition[920]: Stage: fetch-offline Jan 14 01:16:21.798035 ignition[920]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:21.817221 kernel: audit: type=1130 audit(1768353381.801:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:21.801000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.800763 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:16:21.798050 ignition[920]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:21.802641 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 14 01:16:21.798183 ignition[920]: parsed url from cmdline: "" Jan 14 01:16:21.798190 ignition[920]: no config URL provided Jan 14 01:16:21.798197 ignition[920]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:16:21.798212 ignition[920]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:16:21.798219 ignition[920]: failed to fetch config: resource requires networking Jan 14 01:16:21.798524 ignition[920]: Ignition finished successfully Jan 14 01:16:21.834297 ignition[926]: Ignition 2.24.0 Jan 14 01:16:21.834309 ignition[926]: Stage: fetch Jan 14 01:16:21.834433 ignition[926]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:21.834441 ignition[926]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:21.834529 ignition[926]: parsed url from cmdline: "" Jan 14 01:16:21.834532 ignition[926]: no config URL provided Jan 14 01:16:21.834539 ignition[926]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:16:21.834547 ignition[926]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:16:21.834581 ignition[926]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 14 01:16:21.841514 ignition[926]: GET result: OK Jan 14 01:16:21.841609 ignition[926]: parsing config with SHA512: e5beadb4501aedf4e5d0444ad92b42fa7952cb39a25e4239743a9d9536c1b4c67f4c9e556d4faed969e615b79c823826eb68fddcc2c27052a6e21076887857dc Jan 14 01:16:21.848051 unknown[926]: fetched base config from "system" Jan 14 
01:16:21.848061 unknown[926]: fetched base config from "system" Jan 14 01:16:21.848404 ignition[926]: fetch: fetch complete Jan 14 01:16:21.848067 unknown[926]: fetched user config from "hetzner" Jan 14 01:16:21.865522 kernel: audit: type=1130 audit(1768353381.852:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.852000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.848410 ignition[926]: fetch: fetch passed Jan 14 01:16:21.851367 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 14 01:16:21.848453 ignition[926]: Ignition finished successfully Jan 14 01:16:21.853081 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 14 01:16:21.894425 ignition[933]: Ignition 2.24.0 Jan 14 01:16:21.895185 ignition[933]: Stage: kargs Jan 14 01:16:21.895381 ignition[933]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:21.895390 ignition[933]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:21.895957 ignition[933]: kargs: kargs passed Jan 14 01:16:21.911727 kernel: audit: type=1130 audit(1768353381.898:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.897811 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 14 01:16:21.896010 ignition[933]: Ignition finished successfully Jan 14 01:16:21.901121 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Jan 14 01:16:21.934989 ignition[939]: Ignition 2.24.0 Jan 14 01:16:21.935006 ignition[939]: Stage: disks Jan 14 01:16:21.935189 ignition[939]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:21.937630 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 14 01:16:21.952441 kernel: audit: type=1130 audit(1768353381.938:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.938000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:21.935202 ignition[939]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:21.939535 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 14 01:16:21.935876 ignition[939]: disks: disks passed Jan 14 01:16:21.953086 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 14 01:16:21.935927 ignition[939]: Ignition finished successfully Jan 14 01:16:21.954089 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:16:21.955165 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:16:21.956163 systemd[1]: Reached target basic.target - Basic System. Jan 14 01:16:21.958340 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 14 01:16:22.014976 systemd-fsck[947]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 14 01:16:22.022066 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 14 01:16:22.036905 kernel: audit: type=1130 audit(1768353382.022:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:22.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.026372 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 14 01:16:22.207180 kernel: EXT4-fs (sda9): mounted filesystem 9c98b0a3-27fc-41c4-a169-349b38bd9ceb r/w with ordered data mode. Quota mode: none. Jan 14 01:16:22.209286 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 14 01:16:22.211604 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 14 01:16:22.215546 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 01:16:22.219177 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 14 01:16:22.228541 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 14 01:16:22.232239 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 14 01:16:22.235333 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 01:16:22.248700 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 14 01:16:22.259171 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (955) Jan 14 01:16:22.261545 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 14 01:16:22.263818 systemd-networkd[698]: eth1: Gained IPv6LL Jan 14 01:16:22.277141 kernel: BTRFS info (device sda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:16:22.277200 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:16:22.305605 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 01:16:22.305661 kernel: BTRFS info (device sda6): turning on async discard Jan 14 01:16:22.305672 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 01:16:22.314650 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:16:22.322184 systemd-networkd[698]: eth0: Gained IPv6LL Jan 14 01:16:22.351453 coreos-metadata[957]: Jan 14 01:16:22.351 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 14 01:16:22.353319 coreos-metadata[957]: Jan 14 01:16:22.353 INFO Fetch successful Jan 14 01:16:22.353319 coreos-metadata[957]: Jan 14 01:16:22.353 INFO wrote hostname ci-4578-0-0-p-2c3a114250 to /sysroot/etc/hostname Jan 14 01:16:22.356464 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 14 01:16:22.357000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.364149 kernel: audit: type=1130 audit(1768353382.357:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.481181 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 14 01:16:22.488169 kernel: audit: type=1130 audit(1768353382.481:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:22.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.484186 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 14 01:16:22.497288 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 14 01:16:22.506206 kernel: BTRFS info (device sda6): last unmount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:16:22.531816 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 14 01:16:22.537124 ignition[1055]: INFO : Ignition 2.24.0 Jan 14 01:16:22.537124 ignition[1055]: INFO : Stage: mount Jan 14 01:16:22.537124 ignition[1055]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:22.537124 ignition[1055]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:22.540928 ignition[1055]: INFO : mount: mount passed Jan 14 01:16:22.540928 ignition[1055]: INFO : Ignition finished successfully Jan 14 01:16:22.542714 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 14 01:16:22.543000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.543457 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 14 01:16:22.544000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:22.547265 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 14 01:16:22.562178 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Jan 14 01:16:22.586145 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1068) Jan 14 01:16:22.590524 kernel: BTRFS info (device sda6): first mount of filesystem 37f804f9-71c0-44d1-975c-4a397de322e7 Jan 14 01:16:22.590584 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 14 01:16:22.596966 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 01:16:22.597028 kernel: BTRFS info (device sda6): turning on async discard Jan 14 01:16:22.597039 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 01:16:22.600517 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:16:22.628297 ignition[1084]: INFO : Ignition 2.24.0 Jan 14 01:16:22.628297 ignition[1084]: INFO : Stage: files Jan 14 01:16:22.630244 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:22.630244 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:22.630244 ignition[1084]: DEBUG : files: compiled without relabeling support, skipping Jan 14 01:16:22.630244 ignition[1084]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 14 01:16:22.630244 ignition[1084]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 14 01:16:22.634005 ignition[1084]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 14 01:16:22.634771 ignition[1084]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 14 01:16:22.634771 ignition[1084]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 14 01:16:22.634441 unknown[1084]: wrote ssh authorized keys file for user: core Jan 14 01:16:22.637077 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 14 01:16:22.637077 ignition[1084]: INFO : files: createFilesystemsFiles: 
createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 14 01:16:22.903905 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 14 01:16:23.210722 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 14 01:16:23.210722 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:16:23.213668 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 14 01:16:23.219449 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Jan 14 01:16:23.633062 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 14 01:16:23.996001 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jan 14 01:16:23.996001 ignition[1084]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 14 01:16:23.998283 ignition[1084]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:16:24.001792 ignition[1084]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:16:24.001792 ignition[1084]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 14 01:16:24.001792 ignition[1084]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: op(d): op(e): [finished] writing systemd 
drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:16:24.005908 ignition[1084]: INFO : files: files passed Jan 14 01:16:24.005908 ignition[1084]: INFO : Ignition finished successfully Jan 14 01:16:24.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.005359 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 14 01:16:24.009246 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 14 01:16:24.012032 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 14 01:16:24.022057 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 14 01:16:24.022472 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 14 01:16:24.024000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:24.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.029496 initrd-setup-root-after-ignition[1117]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:16:24.029496 initrd-setup-root-after-ignition[1117]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:16:24.031442 initrd-setup-root-after-ignition[1121]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:16:24.033508 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:16:24.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.034384 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 14 01:16:24.036017 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 14 01:16:24.086548 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 14 01:16:24.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.087000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.086653 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 14 01:16:24.087359 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 14 01:16:24.087991 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 14 01:16:24.089039 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 14 01:16:24.091216 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 14 01:16:24.136042 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:16:24.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.139910 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 14 01:16:24.163026 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:16:24.163645 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:16:24.165271 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:16:24.166844 systemd[1]: Stopped target timers.target - Timer Units. Jan 14 01:16:24.168255 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 14 01:16:24.170000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.168634 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:16:24.171391 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 14 01:16:24.173805 systemd[1]: Stopped target basic.target - Basic System. Jan 14 01:16:24.175689 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 14 01:16:24.177262 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 14 01:16:24.178831 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 14 01:16:24.180048 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:16:24.181352 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 14 01:16:24.182713 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:16:24.183862 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 14 01:16:24.185299 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 14 01:16:24.186690 systemd[1]: Stopped target swap.target - Swaps. Jan 14 01:16:24.188132 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 14 01:16:24.188480 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 14 01:16:24.190000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.191504 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:16:24.192850 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:16:24.194301 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 14 01:16:24.194544 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:16:24.197000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.195832 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 14 01:16:24.196032 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Jan 14 01:16:24.200000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.198658 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 14 01:16:24.202000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.199005 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:16:24.204000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.200709 systemd[1]: ignition-files.service: Deactivated successfully. Jan 14 01:16:24.200999 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 14 01:16:24.202509 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 14 01:16:24.202810 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 14 01:16:24.207425 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 14 01:16:24.210297 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 14 01:16:24.211464 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 14 01:16:24.212173 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:16:24.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.213522 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Jan 14 01:16:24.214247 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:16:24.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.215552 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 14 01:16:24.216266 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:16:24.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.222271 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 14 01:16:24.223011 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 14 01:16:24.224000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.247929 ignition[1141]: INFO : Ignition 2.24.0 Jan 14 01:16:24.249203 ignition[1141]: INFO : Stage: umount Jan 14 01:16:24.249203 ignition[1141]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:16:24.249203 ignition[1141]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 01:16:24.252650 ignition[1141]: INFO : umount: umount passed Jan 14 01:16:24.252650 ignition[1141]: INFO : Ignition finished successfully Jan 14 01:16:24.254815 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 14 01:16:24.254967 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Jan 14 01:16:24.256000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.256957 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 14 01:16:24.257536 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 14 01:16:24.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.258676 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 14 01:16:24.258733 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 14 01:16:24.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.260421 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 14 01:16:24.260961 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 14 01:16:24.261000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.262128 systemd[1]: Stopped target network.target - Network. Jan 14 01:16:24.263125 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 14 01:16:24.263595 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:16:24.264000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.264368 systemd[1]: Stopped target paths.target - Path Units. 
Jan 14 01:16:24.265024 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 14 01:16:24.270155 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:16:24.270864 systemd[1]: Stopped target slices.target - Slice Units. Jan 14 01:16:24.271499 systemd[1]: Stopped target sockets.target - Socket Units. Jan 14 01:16:24.272174 systemd[1]: iscsid.socket: Deactivated successfully. Jan 14 01:16:24.272213 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 01:16:24.273162 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 14 01:16:24.273208 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:16:24.273602 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 14 01:16:24.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.273628 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:16:24.273938 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 14 01:16:24.273981 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 14 01:16:24.274323 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 14 01:16:24.274358 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 14 01:16:24.274755 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 14 01:16:24.277232 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 14 01:16:24.278694 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Jan 14 01:16:24.288025 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 14 01:16:24.288341 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 14 01:16:24.290000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.293698 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 14 01:16:24.294209 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 14 01:16:24.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.295000 audit: BPF prog-id=6 op=UNLOAD Jan 14 01:16:24.296866 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 14 01:16:24.297000 audit: BPF prog-id=9 op=UNLOAD Jan 14 01:16:24.297601 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 14 01:16:24.297640 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:16:24.301000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.301261 systemd[1]: Stopping network-cleanup.service - Network Cleanup... 
Jan 14 01:16:24.301586 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 14 01:16:24.301645 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:16:24.302013 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 14 01:16:24.302053 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:16:24.302401 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 14 01:16:24.302436 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 14 01:16:24.305139 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:16:24.321763 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 14 01:16:24.322756 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 14 01:16:24.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.324727 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 14 01:16:24.325002 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:16:24.326000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.328322 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 14 01:16:24.329092 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 14 01:16:24.330864 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 14 01:16:24.331778 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 14 01:16:24.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.332517 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 14 01:16:24.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.332605 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:16:24.336000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.334509 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 14 01:16:24.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.334558 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 14 01:16:24.335904 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 14 01:16:24.335950 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:16:24.337300 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 14 01:16:24.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.337341 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
Jan 14 01:16:24.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.340228 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 14 01:16:24.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.341221 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 14 01:16:24.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.341278 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:16:24.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.342604 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 14 01:16:24.342654 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:16:24.344175 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 14 01:16:24.344220 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:16:24.345278 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 14 01:16:24.345320 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:16:24.346606 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 14 01:16:24.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.346656 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:24.359044 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 14 01:16:24.359222 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 14 01:16:24.372983 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 14 01:16:24.373108 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 14 01:16:24.374000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:24.374562 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 14 01:16:24.375978 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 14 01:16:24.396169 systemd[1]: Switching root. Jan 14 01:16:24.459954 systemd-journald[320]: Journal stopped Jan 14 01:16:25.641114 systemd-journald[320]: Received SIGTERM from PID 1 (systemd). 
Jan 14 01:16:25.641188 kernel: SELinux: policy capability network_peer_controls=1 Jan 14 01:16:25.641204 kernel: SELinux: policy capability open_perms=1 Jan 14 01:16:25.641218 kernel: SELinux: policy capability extended_socket_class=1 Jan 14 01:16:25.641235 kernel: SELinux: policy capability always_check_network=0 Jan 14 01:16:25.641244 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 14 01:16:25.641253 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 14 01:16:25.641262 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 14 01:16:25.641274 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 14 01:16:25.641283 kernel: SELinux: policy capability userspace_initial_context=0 Jan 14 01:16:25.641293 systemd[1]: Successfully loaded SELinux policy in 71.587ms. Jan 14 01:16:25.641309 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.976ms. Jan 14 01:16:25.641319 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:16:25.641329 systemd[1]: Detected virtualization kvm. Jan 14 01:16:25.641341 systemd[1]: Detected architecture x86-64. Jan 14 01:16:25.641354 systemd[1]: Detected first boot. Jan 14 01:16:25.641363 systemd[1]: Hostname set to . Jan 14 01:16:25.641373 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:16:25.641383 zram_generator::config[1184]: No configuration found. 
Jan 14 01:16:25.641394 kernel: Guest personality initialized and is inactive Jan 14 01:16:25.641404 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 14 01:16:25.641415 kernel: Initialized host personality Jan 14 01:16:25.641424 kernel: NET: Registered PF_VSOCK protocol family Jan 14 01:16:25.641434 systemd[1]: Populated /etc with preset unit settings. Jan 14 01:16:25.641443 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 14 01:16:25.641453 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 14 01:16:25.641467 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 14 01:16:25.641480 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 14 01:16:25.641507 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 14 01:16:25.641517 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 14 01:16:25.641526 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 14 01:16:25.641536 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 14 01:16:25.641547 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 14 01:16:25.641561 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 14 01:16:25.641570 systemd[1]: Created slice user.slice - User and Session Slice. Jan 14 01:16:25.641584 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:16:25.641597 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:16:25.641607 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 14 01:16:25.641617 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 14 01:16:25.641627 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 14 01:16:25.641641 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:16:25.641651 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 14 01:16:25.641661 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:16:25.641671 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:16:25.641681 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 14 01:16:25.641690 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 14 01:16:25.641702 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 14 01:16:25.641712 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 14 01:16:25.641722 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:16:25.641732 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:16:25.641741 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 14 01:16:25.641751 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:16:25.641761 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:16:25.641775 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 14 01:16:25.641785 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 14 01:16:25.641794 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 14 01:16:25.641804 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:16:25.641814 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. 
Jan 14 01:16:25.641823 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:16:25.641833 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 14 01:16:25.641845 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 14 01:16:25.641854 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:16:25.641864 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:16:25.641873 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 14 01:16:25.641883 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 14 01:16:25.641893 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 14 01:16:25.641903 systemd[1]: Mounting media.mount - External Media Directory... Jan 14 01:16:25.641914 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:25.641924 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 14 01:16:25.641933 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 14 01:16:25.641943 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 14 01:16:25.641952 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 14 01:16:25.641962 systemd[1]: Reached target machines.target - Containers. Jan 14 01:16:25.641971 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 14 01:16:25.641984 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:16:25.641994 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jan 14 01:16:25.642003 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 14 01:16:25.642013 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:16:25.642023 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:16:25.642032 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:16:25.642042 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 14 01:16:25.642057 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:16:25.642066 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 14 01:16:25.642076 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 14 01:16:25.642085 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 14 01:16:25.642137 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 14 01:16:25.642149 systemd[1]: Stopped systemd-fsck-usr.service. Jan 14 01:16:25.642161 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:16:25.642174 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:16:25.642184 kernel: fuse: init (API version 7.41) Jan 14 01:16:25.642194 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 01:16:25.642205 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:16:25.642215 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Jan 14 01:16:25.642225 kernel: ACPI: bus type drm_connector registered Jan 14 01:16:25.642235 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 14 01:16:25.642245 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 01:16:25.642258 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:25.642286 systemd-journald[1255]: Collecting audit messages is enabled. Jan 14 01:16:25.642304 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 14 01:16:25.642314 systemd-journald[1255]: Journal started Jan 14 01:16:25.642331 systemd-journald[1255]: Runtime Journal (/run/log/journal/a534da6f2c294c32971dde88655a083a) is 8M, max 76M, 68M free. Jan 14 01:16:25.647515 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 14 01:16:25.647560 systemd[1]: Mounted media.mount - External Media Directory. Jan 14 01:16:25.390000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 14 01:16:25.553000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:25.568000 audit: BPF prog-id=14 op=UNLOAD Jan 14 01:16:25.568000 audit: BPF prog-id=13 op=UNLOAD Jan 14 01:16:25.569000 audit: BPF prog-id=15 op=LOAD Jan 14 01:16:25.570000 audit: BPF prog-id=16 op=LOAD Jan 14 01:16:25.570000 audit: BPF prog-id=17 op=LOAD Jan 14 01:16:25.639000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 14 01:16:25.639000 audit[1255]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffe55fe85e0 a2=4000 a3=0 items=0 ppid=1 pid=1255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:25.639000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 14 01:16:25.304772 systemd[1]: Queued start job for default target multi-user.target. Jan 14 01:16:25.318332 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 14 01:16:25.318848 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 14 01:16:25.655684 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:16:25.656000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.659039 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 14 01:16:25.661378 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 14 01:16:25.664000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.663307 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Jan 14 01:16:25.664258 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:16:25.665396 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 14 01:16:25.665727 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 14 01:16:25.667000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.667000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.667693 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 01:16:25.667894 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:16:25.668000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.668000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.669060 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:16:25.669610 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 01:16:25.670000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:25.670000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.670894 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:16:25.672000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.672000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.674000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.672398 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 01:16:25.673328 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 14 01:16:25.673547 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 14 01:16:25.674555 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:16:25.674759 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:16:25.676000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:25.676000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.678000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.676721 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 01:16:25.678879 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:16:25.680813 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 14 01:16:25.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.682219 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 14 01:16:25.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.683653 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 14 01:16:25.684000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:16:25.699903 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 01:16:25.701919 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 14 01:16:25.704236 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 14 01:16:25.708270 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 14 01:16:25.708941 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 14 01:16:25.709022 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:16:25.712231 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 14 01:16:25.713470 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:16:25.714203 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:16:25.723271 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 14 01:16:25.726224 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 14 01:16:25.726635 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:16:25.729439 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 14 01:16:25.730011 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:16:25.734253 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:16:25.740251 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
Jan 14 01:16:25.742215 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 01:16:25.745806 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 14 01:16:25.747447 systemd-journald[1255]: Time spent on flushing to /var/log/journal/a534da6f2c294c32971dde88655a083a is 44.965ms for 1366 entries. Jan 14 01:16:25.747447 systemd-journald[1255]: System Journal (/var/log/journal/a534da6f2c294c32971dde88655a083a) is 8M, max 588.1M, 580.1M free. Jan 14 01:16:25.802360 systemd-journald[1255]: Received client request to flush runtime journal. Jan 14 01:16:25.802551 kernel: loop1: detected capacity change from 0 to 50784 Jan 14 01:16:25.768000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.780000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.746286 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 14 01:16:25.768288 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 14 01:16:25.807000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.768887 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 14 01:16:25.772394 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 14 01:16:25.780482 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Jan 14 01:16:25.803537 systemd-tmpfiles[1311]: ACLs are not supported, ignoring. Jan 14 01:16:25.803547 systemd-tmpfiles[1311]: ACLs are not supported, ignoring. Jan 14 01:16:25.806611 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 14 01:16:25.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.814250 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:16:25.819790 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 14 01:16:25.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.836631 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 14 01:16:25.846122 kernel: loop2: detected capacity change from 0 to 8 Jan 14 01:16:25.853326 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:16:25.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.862200 kernel: loop3: detected capacity change from 0 to 229808 Jan 14 01:16:25.880515 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 14 01:16:25.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:25.882000 audit: BPF prog-id=18 op=LOAD Jan 14 01:16:25.883000 audit: BPF prog-id=19 op=LOAD Jan 14 01:16:25.883000 audit: BPF prog-id=20 op=LOAD Jan 14 01:16:25.887000 audit: BPF prog-id=21 op=LOAD Jan 14 01:16:25.886283 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 14 01:16:25.891238 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 01:16:25.894206 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 01:16:25.896000 audit: BPF prog-id=22 op=LOAD Jan 14 01:16:25.897000 audit: BPF prog-id=23 op=LOAD Jan 14 01:16:25.897000 audit: BPF prog-id=24 op=LOAD Jan 14 01:16:25.897867 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 14 01:16:25.899355 kernel: loop4: detected capacity change from 0 to 111560 Jan 14 01:16:25.903000 audit: BPF prog-id=25 op=LOAD Jan 14 01:16:25.903000 audit: BPF prog-id=26 op=LOAD Jan 14 01:16:25.903000 audit: BPF prog-id=27 op=LOAD Jan 14 01:16:25.904545 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 14 01:16:25.940307 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Jan 14 01:16:25.941123 kernel: loop5: detected capacity change from 0 to 50784 Jan 14 01:16:25.940659 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Jan 14 01:16:25.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.949265 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:16:25.959734 systemd-nsresourced[1337]: Not setting up BPF subsystem, as functionality has been disabled at compile time. 
Jan 14 01:16:25.969146 kernel: loop6: detected capacity change from 0 to 8 Jan 14 01:16:25.976149 kernel: loop7: detected capacity change from 0 to 229808 Jan 14 01:16:25.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:25.977459 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 14 01:16:25.986082 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 14 01:16:25.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.003277 kernel: loop1: detected capacity change from 0 to 111560 Jan 14 01:16:26.021875 (sd-merge)[1341]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 14 01:16:26.027329 (sd-merge)[1341]: Merged extensions into '/usr'. Jan 14 01:16:26.037247 systemd[1]: Reload requested from client PID 1310 ('systemd-sysext') (unit systemd-sysext.service)... Jan 14 01:16:26.037264 systemd[1]: Reloading... Jan 14 01:16:26.059195 systemd-oomd[1334]: No swap; memory pressure usage will be degraded Jan 14 01:16:26.106041 systemd-resolved[1335]: Positive Trust Anchors: Jan 14 01:16:26.106059 systemd-resolved[1335]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:16:26.106064 systemd-resolved[1335]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:16:26.106086 systemd-resolved[1335]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:16:26.124327 systemd-resolved[1335]: Using system hostname 'ci-4578-0-0-p-2c3a114250'. Jan 14 01:16:26.145129 zram_generator::config[1389]: No configuration found. Jan 14 01:16:26.304690 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 14 01:16:26.305050 systemd[1]: Reloading finished in 267 ms. Jan 14 01:16:26.329020 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 14 01:16:26.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.329881 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:16:26.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.330623 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 14 01:16:26.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:26.331557 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 14 01:16:26.331000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.335166 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:16:26.343447 systemd[1]: Starting ensure-sysext.service... Jan 14 01:16:26.346000 audit: BPF prog-id=8 op=UNLOAD Jan 14 01:16:26.346000 audit: BPF prog-id=7 op=UNLOAD Jan 14 01:16:26.346000 audit: BPF prog-id=28 op=LOAD Jan 14 01:16:26.346000 audit: BPF prog-id=29 op=LOAD Jan 14 01:16:26.346208 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:16:26.351000 audit: BPF prog-id=30 op=LOAD Jan 14 01:16:26.351000 audit: BPF prog-id=21 op=UNLOAD Jan 14 01:16:26.350392 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 14 01:16:26.352000 audit: BPF prog-id=31 op=LOAD Jan 14 01:16:26.361000 audit: BPF prog-id=25 op=UNLOAD Jan 14 01:16:26.361000 audit: BPF prog-id=32 op=LOAD Jan 14 01:16:26.361000 audit: BPF prog-id=33 op=LOAD Jan 14 01:16:26.361000 audit: BPF prog-id=26 op=UNLOAD Jan 14 01:16:26.361000 audit: BPF prog-id=27 op=UNLOAD Jan 14 01:16:26.362000 audit: BPF prog-id=34 op=LOAD Jan 14 01:16:26.362000 audit: BPF prog-id=15 op=UNLOAD Jan 14 01:16:26.362000 audit: BPF prog-id=35 op=LOAD Jan 14 01:16:26.362000 audit: BPF prog-id=36 op=LOAD Jan 14 01:16:26.362000 audit: BPF prog-id=16 op=UNLOAD Jan 14 01:16:26.362000 audit: BPF prog-id=17 op=UNLOAD Jan 14 01:16:26.364000 audit: BPF prog-id=37 op=LOAD Jan 14 01:16:26.364000 audit: BPF prog-id=18 op=UNLOAD Jan 14 01:16:26.364000 audit: BPF prog-id=38 op=LOAD Jan 14 01:16:26.364000 audit: BPF prog-id=39 op=LOAD Jan 14 01:16:26.364000 audit: BPF prog-id=19 op=UNLOAD Jan 14 01:16:26.364000 audit: BPF prog-id=20 op=UNLOAD Jan 14 01:16:26.365000 audit: BPF prog-id=40 op=LOAD Jan 14 01:16:26.365000 audit: BPF prog-id=22 op=UNLOAD Jan 14 01:16:26.365000 audit: BPF prog-id=41 op=LOAD Jan 14 01:16:26.365000 audit: BPF prog-id=42 op=LOAD Jan 14 01:16:26.365000 audit: BPF prog-id=23 op=UNLOAD Jan 14 01:16:26.365000 audit: BPF prog-id=24 op=UNLOAD Jan 14 01:16:26.384983 systemd[1]: Reload requested from client PID 1429 ('systemctl') (unit ensure-sysext.service)... Jan 14 01:16:26.385008 systemd[1]: Reloading... Jan 14 01:16:26.388694 systemd-tmpfiles[1430]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 01:16:26.388745 systemd-tmpfiles[1430]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 14 01:16:26.389275 systemd-tmpfiles[1430]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 01:16:26.393841 systemd-tmpfiles[1430]: ACLs are not supported, ignoring. 
Jan 14 01:16:26.394619 systemd-tmpfiles[1430]: ACLs are not supported, ignoring. Jan 14 01:16:26.414995 systemd-tmpfiles[1430]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:16:26.416363 systemd-tmpfiles[1430]: Skipping /boot Jan 14 01:16:26.438564 systemd-udevd[1431]: Using default interface naming scheme 'v257'. Jan 14 01:16:26.440067 systemd-tmpfiles[1430]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:16:26.441196 systemd-tmpfiles[1430]: Skipping /boot Jan 14 01:16:26.480135 zram_generator::config[1466]: No configuration found. Jan 14 01:16:26.611258 kernel: mousedev: PS/2 mouse device common for all mice Jan 14 01:16:26.687133 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Jan 14 01:16:26.698144 kernel: ACPI: button: Power Button [PWRF] Jan 14 01:16:26.725070 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 14 01:16:26.725517 systemd[1]: Reloading finished in 340 ms. Jan 14 01:16:26.742203 kernel: kauditd_printk_skb: 148 callbacks suppressed Jan 14 01:16:26.742271 kernel: audit: type=1130 audit(1768353386.736:186): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.736412 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:16:26.744027 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 14 01:16:26.756154 kernel: audit: type=1130 audit(1768353386.744:187): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.744000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.748000 audit: BPF prog-id=43 op=LOAD Jan 14 01:16:26.760222 kernel: audit: type=1334 audit(1768353386.748:188): prog-id=43 op=LOAD Jan 14 01:16:26.760258 kernel: audit: type=1334 audit(1768353386.753:189): prog-id=34 op=UNLOAD Jan 14 01:16:26.753000 audit: BPF prog-id=34 op=UNLOAD Jan 14 01:16:26.761714 kernel: audit: type=1334 audit(1768353386.753:190): prog-id=44 op=LOAD Jan 14 01:16:26.753000 audit: BPF prog-id=44 op=LOAD Jan 14 01:16:26.763320 kernel: audit: type=1334 audit(1768353386.753:191): prog-id=45 op=LOAD Jan 14 01:16:26.753000 audit: BPF prog-id=45 op=LOAD Jan 14 01:16:26.774130 kernel: audit: type=1334 audit(1768353386.753:192): prog-id=35 op=UNLOAD Jan 14 01:16:26.774203 kernel: audit: type=1334 audit(1768353386.753:193): prog-id=36 op=UNLOAD Jan 14 01:16:26.753000 audit: BPF prog-id=35 op=UNLOAD Jan 14 01:16:26.753000 audit: BPF prog-id=36 op=UNLOAD Jan 14 01:16:26.754000 audit: BPF prog-id=46 op=LOAD Jan 14 01:16:26.754000 audit: BPF prog-id=31 op=UNLOAD Jan 14 01:16:26.779968 kernel: audit: type=1334 audit(1768353386.754:194): prog-id=46 op=LOAD Jan 14 01:16:26.780001 kernel: audit: type=1334 audit(1768353386.754:195): prog-id=31 op=UNLOAD Jan 14 01:16:26.754000 audit: BPF prog-id=47 op=LOAD Jan 14 01:16:26.754000 audit: BPF prog-id=48 op=LOAD Jan 14 01:16:26.754000 audit: BPF prog-id=32 op=UNLOAD Jan 14 01:16:26.754000 audit: BPF prog-id=33 op=UNLOAD Jan 14 01:16:26.754000 audit: BPF prog-id=49 op=LOAD Jan 14 
01:16:26.754000 audit: BPF prog-id=30 op=UNLOAD Jan 14 01:16:26.755000 audit: BPF prog-id=50 op=LOAD Jan 14 01:16:26.755000 audit: BPF prog-id=37 op=UNLOAD Jan 14 01:16:26.755000 audit: BPF prog-id=51 op=LOAD Jan 14 01:16:26.755000 audit: BPF prog-id=52 op=LOAD Jan 14 01:16:26.755000 audit: BPF prog-id=38 op=UNLOAD Jan 14 01:16:26.755000 audit: BPF prog-id=39 op=UNLOAD Jan 14 01:16:26.765000 audit: BPF prog-id=53 op=LOAD Jan 14 01:16:26.765000 audit: BPF prog-id=40 op=UNLOAD Jan 14 01:16:26.765000 audit: BPF prog-id=54 op=LOAD Jan 14 01:16:26.765000 audit: BPF prog-id=55 op=LOAD Jan 14 01:16:26.765000 audit: BPF prog-id=41 op=UNLOAD Jan 14 01:16:26.765000 audit: BPF prog-id=42 op=UNLOAD Jan 14 01:16:26.765000 audit: BPF prog-id=56 op=LOAD Jan 14 01:16:26.765000 audit: BPF prog-id=57 op=LOAD Jan 14 01:16:26.765000 audit: BPF prog-id=28 op=UNLOAD Jan 14 01:16:26.765000 audit: BPF prog-id=29 op=UNLOAD Jan 14 01:16:26.794140 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 14 01:16:26.798111 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 14 01:16:26.798340 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 14 01:16:26.807370 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 14 01:16:26.822425 kernel: EDAC MC: Ver: 3.0.0 Jan 14 01:16:26.829131 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 14 01:16:26.831551 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. 
Jan 14 01:16:26.836906 kernel: Console: switching to colour dummy device 80x25 Jan 14 01:16:26.836960 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 14 01:16:26.843605 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 14 01:16:26.843644 kernel: [drm] features: -context_init Jan 14 01:16:26.844429 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:26.847137 kernel: [drm] number of scanouts: 1 Jan 14 01:16:26.847386 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:16:26.852185 kernel: [drm] number of cap sets: 0 Jan 14 01:16:26.852214 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 14 01:16:26.850238 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 14 01:16:26.862038 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 14 01:16:26.862090 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 01:16:26.920920 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 14 01:16:26.921981 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:16:26.925158 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:16:26.931919 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:16:26.935772 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:16:26.936916 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:16:26.937069 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 14 01:16:26.941146 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 14 01:16:26.945634 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 14 01:16:26.946767 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:16:26.953000 audit: BPF prog-id=58 op=LOAD Jan 14 01:16:26.952342 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 14 01:16:26.955479 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 01:16:26.959289 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 14 01:16:26.959395 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:26.976790 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:26.977739 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:16:26.981080 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:16:26.983283 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:16:26.983631 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 14 01:16:26.983827 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:16:26.984004 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 14 01:16:26.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.993000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:26.990666 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:16:26.993159 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 01:16:26.995394 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:16:27.008000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.000803 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:16:27.008166 systemd[1]: Finished ensure-sysext.service. Jan 14 01:16:27.022000 audit: BPF prog-id=59 op=LOAD Jan 14 01:16:27.024032 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 14 01:16:27.025615 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Jan 14 01:16:27.027000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.026848 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:16:27.050000 audit[1575]: SYSTEM_BOOT pid=1575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.054000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.054524 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:16:27.054735 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 01:16:27.062745 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:16:27.065000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.065000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:16:27.064010 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:16:27.067413 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:16:27.070517 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 14 01:16:27.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.084782 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 14 01:16:27.085000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.092890 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:16:27.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.094305 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:27.100745 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:16:27.117982 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Jan 14 01:16:27.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:27.162000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 14 01:16:27.162000 audit[1614]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe9f16c5d0 a2=420 a3=0 items=0 ppid=1558 pid=1614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:27.162000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:16:27.163306 augenrules[1614]: No rules Jan 14 01:16:27.166838 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:16:27.167133 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 01:16:27.177945 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 14 01:16:27.179989 systemd[1]: Reached target time-set.target - System Time Set. Jan 14 01:16:27.198114 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 14 01:16:27.199755 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 14 01:16:27.200052 systemd-networkd[1572]: lo: Link UP Jan 14 01:16:27.200059 systemd-networkd[1572]: lo: Gained carrier Jan 14 01:16:27.205326 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 14 01:16:27.205414 systemd-networkd[1572]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:27.205452 systemd-networkd[1572]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:16:27.206485 systemd[1]: Reached target network.target - Network. Jan 14 01:16:27.210264 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 14 01:16:27.210914 systemd-networkd[1572]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:27.210919 systemd-networkd[1572]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:16:27.214263 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 14 01:16:27.218886 systemd-networkd[1572]: eth0: Link UP Jan 14 01:16:27.219366 systemd-networkd[1572]: eth0: Gained carrier Jan 14 01:16:27.219423 systemd-networkd[1572]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:27.236068 systemd-networkd[1572]: eth1: Link UP Jan 14 01:16:27.238089 systemd-networkd[1572]: eth1: Gained carrier Jan 14 01:16:27.239152 systemd-networkd[1572]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:16:27.251936 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:16:27.256463 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 14 01:16:27.277187 systemd-networkd[1572]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 14 01:16:27.279279 systemd-timesyncd[1583]: Network configuration changed, trying to establish connection. 
Jan 14 01:16:27.306207 systemd-networkd[1572]: eth0: DHCPv4 address 77.42.79.167/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 14 01:16:27.307758 systemd-timesyncd[1583]: Network configuration changed, trying to establish connection. Jan 14 01:16:27.543466 ldconfig[1566]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 14 01:16:27.548396 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 14 01:16:27.552418 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 14 01:16:27.578366 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 14 01:16:27.580701 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:16:27.581268 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 14 01:16:27.583203 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 14 01:16:27.583578 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 14 01:16:27.584223 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 14 01:16:27.584808 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 14 01:16:27.585301 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 14 01:16:27.585861 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 14 01:16:27.588651 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 14 01:16:27.589684 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 14 01:16:27.589742 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:16:27.590715 systemd[1]: Reached target timers.target - Timer Units. 
Jan 14 01:16:27.593935 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 14 01:16:27.598867 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 14 01:16:27.606000 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 14 01:16:27.610558 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 14 01:16:27.611567 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 14 01:16:27.624009 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 14 01:16:27.627681 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 14 01:16:27.628892 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 14 01:16:27.631522 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:16:27.632712 systemd[1]: Reached target basic.target - Basic System. Jan 14 01:16:27.633084 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:16:27.633120 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:16:27.634255 systemd[1]: Starting containerd.service - containerd container runtime... Jan 14 01:16:27.639237 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 14 01:16:27.642452 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 14 01:16:27.647232 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 14 01:16:27.651192 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 14 01:16:27.657779 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
Jan 14 01:16:27.659573 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 14 01:16:27.663643 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 14 01:16:27.667905 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 14 01:16:27.672224 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 14 01:16:27.683279 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 14 01:16:27.689409 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 14 01:16:27.694492 jq[1638]: false Jan 14 01:16:27.694342 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 14 01:16:27.711881 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 14 01:16:27.714191 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 14 01:16:27.725205 extend-filesystems[1639]: Found /dev/sda6 Jan 14 01:16:27.723902 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Refreshing passwd entry cache Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Failure getting users, quitting Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Refreshing group entry cache Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Failure getting groups, quitting Jan 14 01:16:27.752969 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 14 01:16:27.731220 oslogin_cache_refresh[1640]: Refreshing passwd entry cache Jan 14 01:16:27.757523 coreos-metadata[1635]: Jan 14 01:16:27.739 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 14 01:16:27.757523 coreos-metadata[1635]: Jan 14 01:16:27.751 INFO Fetch successful Jan 14 01:16:27.757523 coreos-metadata[1635]: Jan 14 01:16:27.752 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 14 01:16:27.757523 coreos-metadata[1635]: Jan 14 01:16:27.752 INFO Fetch successful Jan 14 01:16:27.727492 systemd[1]: Starting update-engine.service - Update Engine... Jan 14 01:16:27.751529 oslogin_cache_refresh[1640]: Failure getting users, quitting Jan 14 01:16:27.735424 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 14 01:16:27.751554 oslogin_cache_refresh[1640]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 14 01:16:27.746181 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 14 01:16:27.751607 oslogin_cache_refresh[1640]: Refreshing group entry cache Jan 14 01:16:27.751640 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 14 01:16:27.752394 oslogin_cache_refresh[1640]: Failure getting groups, quitting Jan 14 01:16:27.753529 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 14 01:16:27.752406 oslogin_cache_refresh[1640]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Jan 14 01:16:27.755421 systemd[1]: motdgen.service: Deactivated successfully. Jan 14 01:16:27.757404 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 14 01:16:27.761078 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 14 01:16:27.762712 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 14 01:16:27.769479 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 14 01:16:27.769864 extend-filesystems[1639]: Found /dev/sda9 Jan 14 01:16:27.773271 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 14 01:16:27.799948 extend-filesystems[1639]: Checking size of /dev/sda9 Jan 14 01:16:27.809720 jq[1660]: true Jan 14 01:16:27.840909 tar[1668]: linux-amd64/LICENSE Jan 14 01:16:27.844155 tar[1668]: linux-amd64/helm Jan 14 01:16:27.844838 extend-filesystems[1639]: Resized partition /dev/sda9 Jan 14 01:16:27.850159 update_engine[1658]: I20260114 01:16:27.848531 1658 main.cc:92] Flatcar Update Engine starting Jan 14 01:16:27.862422 extend-filesystems[1696]: resize2fs 1.47.3 (8-Jul-2025) Jan 14 01:16:27.870163 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 18410491 blocks Jan 14 01:16:27.882064 jq[1688]: true Jan 14 01:16:27.924344 dbus-daemon[1636]: [system] SELinux support is enabled Jan 14 01:16:27.924655 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 14 01:16:27.929128 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 14 01:16:27.929156 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 14 01:16:27.931453 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Jan 14 01:16:27.931475 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 14 01:16:27.956932 systemd[1]: Started update-engine.service - Update Engine. Jan 14 01:16:27.958234 update_engine[1658]: I20260114 01:16:27.957258 1658 update_check_scheduler.cc:74] Next update check in 2m56s Jan 14 01:16:27.981687 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 14 01:16:27.984454 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 14 01:16:27.986405 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 14 01:16:28.064752 systemd-logind[1657]: New seat seat0. Jan 14 01:16:28.077788 systemd-logind[1657]: Watching system buttons on /dev/input/event3 (Power Button) Jan 14 01:16:28.077812 systemd-logind[1657]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 14 01:16:28.078119 systemd[1]: Started systemd-logind.service - User Login Management. Jan 14 01:16:28.101481 bash[1724]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:16:28.103656 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 14 01:16:28.112426 systemd[1]: Starting sshkeys.service... Jan 14 01:16:28.142823 kernel: EXT4-fs (sda9): resized filesystem to 18410491 Jan 14 01:16:28.142880 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 14 01:16:28.146797 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 14 01:16:28.177464 extend-filesystems[1696]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 14 01:16:28.177464 extend-filesystems[1696]: old_desc_blocks = 1, new_desc_blocks = 9 Jan 14 01:16:28.177464 extend-filesystems[1696]: The filesystem on /dev/sda9 is now 18410491 (4k) blocks long. 
Jan 14 01:16:28.187046 extend-filesystems[1639]: Resized filesystem in /dev/sda9 Jan 14 01:16:28.180247 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 14 01:16:28.181535 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 14 01:16:28.197238 containerd[1677]: time="2026-01-14T01:16:28Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 14 01:16:28.199130 containerd[1677]: time="2026-01-14T01:16:28.198617076Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 14 01:16:28.209811 containerd[1677]: time="2026-01-14T01:16:28.209767730Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.4µs" Jan 14 01:16:28.210226 containerd[1677]: time="2026-01-14T01:16:28.210207361Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211193131Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211222381Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211366541Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211380471Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211436801Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" 
id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211445621Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211682831Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211695541Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211704141Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211712671Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211846411Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.213952 containerd[1677]: time="2026-01-14T01:16:28.211857121Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 14 01:16:28.214188 containerd[1677]: time="2026-01-14T01:16:28.211923551Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.214188 containerd[1677]: time="2026-01-14T01:16:28.212085711Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Jan 14 01:16:28.224420 containerd[1677]: time="2026-01-14T01:16:28.222207846Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 01:16:28.224420 containerd[1677]: time="2026-01-14T01:16:28.222233406Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 14 01:16:28.224420 containerd[1677]: time="2026-01-14T01:16:28.222265516Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 14 01:16:28.224420 containerd[1677]: time="2026-01-14T01:16:28.223351896Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 14 01:16:28.224420 containerd[1677]: time="2026-01-14T01:16:28.223438236Z" level=info msg="metadata content store policy set" policy=shared Jan 14 01:16:28.225431 coreos-metadata[1727]: Jan 14 01:16:28.225 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 14 01:16:28.230327 coreos-metadata[1727]: Jan 14 01:16:28.229 INFO Fetch successful Jan 14 01:16:28.232113 containerd[1677]: time="2026-01-14T01:16:28.232036120Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 14 01:16:28.232219 containerd[1677]: time="2026-01-14T01:16:28.232208160Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:16:28.233256 containerd[1677]: time="2026-01-14T01:16:28.232307320Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:16:28.233755 containerd[1677]: time="2026-01-14T01:16:28.233741170Z" level=info msg="loading plugin" 
id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 14 01:16:28.233811 containerd[1677]: time="2026-01-14T01:16:28.233802621Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 14 01:16:28.233847 containerd[1677]: time="2026-01-14T01:16:28.233838251Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.234192241Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.234208381Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236124041Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236142791Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236154311Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236164501Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236172641Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236193111Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236303442Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236317912Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236329902Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236338052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236346042Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 14 01:16:28.236834 containerd[1677]: time="2026-01-14T01:16:28.236353212Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236363162Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236375412Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236383972Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236392052Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236399952Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236418082Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 14 
01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236455412Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236465032Z" level=info msg="Start snapshots syncer" Jan 14 01:16:28.237050 containerd[1677]: time="2026-01-14T01:16:28.236677032Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 14 01:16:28.237532 containerd[1677]: time="2026-01-14T01:16:28.237367762Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCD
I\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 14 01:16:28.237532 containerd[1677]: time="2026-01-14T01:16:28.237408202Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 14 01:16:28.237785 containerd[1677]: time="2026-01-14T01:16:28.237765702Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 14 01:16:28.237923 containerd[1677]: time="2026-01-14T01:16:28.237907542Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 14 01:16:28.238198 containerd[1677]: time="2026-01-14T01:16:28.238183712Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 14 01:16:28.238312 unknown[1727]: wrote ssh authorized keys file for user: core Jan 14 01:16:28.242471 containerd[1677]: time="2026-01-14T01:16:28.238797993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 14 01:16:28.242471 containerd[1677]: time="2026-01-14T01:16:28.238816613Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 14 01:16:28.242471 containerd[1677]: time="2026-01-14T01:16:28.238828993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 14 01:16:28.242471 containerd[1677]: time="2026-01-14T01:16:28.238837393Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 14 01:16:28.242471 containerd[1677]: time="2026-01-14T01:16:28.238853183Z" level=info msg="loading 
plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242665994Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242706184Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242741044Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242752584Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242759144Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242767154Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242773464Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242783564Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242792644Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242804744Z" level=info msg="runtime interface created" Jan 14 01:16:28.243845 containerd[1677]: 
time="2026-01-14T01:16:28.242809414Z" level=info msg="created NRI interface" Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242815564Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242852724Z" level=info msg="Connect containerd service" Jan 14 01:16:28.243845 containerd[1677]: time="2026-01-14T01:16:28.242875234Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 14 01:16:28.244053 containerd[1677]: time="2026-01-14T01:16:28.243475735Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:16:28.319413 update-ssh-keys[1740]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:16:28.322490 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 14 01:16:28.322874 locksmithd[1706]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 14 01:16:28.327068 systemd[1]: Finished sshkeys.service. 
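The containerd error above ("no network config found in /etc/cni/net.d: cni plugin not initialized") fires when the CNI config directory is empty. A minimal sketch of that presence check, assuming the directory name from the log and the `.conf`/`.conflist`/`.json` extensions that CNI config loaders conventionally scan (the extension list is an assumption, not taken from the log):

```python
from pathlib import Path

# Extensions commonly scanned by CNI config loaders (an assumption,
# not stated in the log itself).
CNI_EXTENSIONS = {".conf", ".conflist", ".json"}

def candidate_cni_configs(names):
    """Return the file names a CNI config loader would likely consider."""
    return sorted(n for n in names if Path(n).suffix in CNI_EXTENSIONS)

def check_cni_dir(conf_dir="/etc/cni/net.d"):
    """Mimic the 'no network config found' check from the log above."""
    p = Path(conf_dir)
    names = [f.name for f in p.iterdir()] if p.is_dir() else []
    return candidate_cni_configs(names)
```

On this host `check_cni_dir()` would return an empty list at this point in the boot, which is why the CRI plugin logs the error and defers pod networking until a CNI config (e.g. from a network addon) appears.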
Jan 14 01:16:28.381623 containerd[1677]: time="2026-01-14T01:16:28.381585942Z" level=info msg="Start subscribing containerd event" Jan 14 01:16:28.381837 containerd[1677]: time="2026-01-14T01:16:28.381817802Z" level=info msg="Start recovering state" Jan 14 01:16:28.382512 containerd[1677]: time="2026-01-14T01:16:28.382489892Z" level=info msg="Start event monitor" Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382684523Z" level=info msg="Start cni network conf syncer for default" Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382696283Z" level=info msg="Start streaming server" Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382704513Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382710643Z" level=info msg="runtime interface starting up..." Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382715543Z" level=info msg="starting plugins..." Jan 14 01:16:28.382786 containerd[1677]: time="2026-01-14T01:16:28.382729363Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 14 01:16:28.384397 containerd[1677]: time="2026-01-14T01:16:28.384375763Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 14 01:16:28.384708 containerd[1677]: time="2026-01-14T01:16:28.384594033Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 14 01:16:28.385399 containerd[1677]: time="2026-01-14T01:16:28.385349614Z" level=info msg="containerd successfully booted in 0.188955s" Jan 14 01:16:28.385594 systemd[1]: Started containerd.service - containerd container runtime. Jan 14 01:16:28.460265 systemd-networkd[1572]: eth1: Gained IPv6LL Jan 14 01:16:28.460793 systemd-timesyncd[1583]: Network configuration changed, trying to establish connection. Jan 14 01:16:28.463663 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. 
Jan 14 01:16:28.464986 systemd[1]: Reached target network-online.target - Network is Online. Jan 14 01:16:28.472395 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:16:28.477071 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 14 01:16:28.520363 sshd_keygen[1665]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 14 01:16:28.533284 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 14 01:16:28.545572 tar[1668]: linux-amd64/README.md Jan 14 01:16:28.557256 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 14 01:16:28.561449 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 14 01:16:28.563849 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 14 01:16:28.578965 systemd[1]: issuegen.service: Deactivated successfully. Jan 14 01:16:28.579395 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 14 01:16:28.582401 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 14 01:16:28.602553 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 14 01:16:28.607361 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 14 01:16:28.612465 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 14 01:16:28.613031 systemd[1]: Reached target getty.target - Login Prompts. Jan 14 01:16:29.227911 systemd-networkd[1572]: eth0: Gained IPv6LL Jan 14 01:16:29.228864 systemd-timesyncd[1583]: Network configuration changed, trying to establish connection. Jan 14 01:16:29.701841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:16:29.706325 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 14 01:16:29.711086 systemd[1]: Startup finished in 3.242s (kernel) + 5.699s (initrd) + 5.178s (userspace) = 14.119s. 
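The "Startup finished in 3.242s (kernel) + 5.699s (initrd) + 5.178s (userspace) = 14.119s." summary above has a fixed shape; a small parser sketch that extracts the per-phase timings and cross-checks them against the reported total:

```python
import re

def parse_startup_finished(line):
    """Parse systemd's 'Startup finished' summary into phase timings."""
    phases = {name: float(sec)
              for sec, name in re.findall(r"([\d.]+)s \((\w+)\)", line)}
    total = float(re.search(r"= ([\d.]+)s", line).group(1))
    return phases, total

# The summary line recorded in the log above:
line = ("Startup finished in 3.242s (kernel) + 5.699s (initrd) "
        "+ 5.178s (userspace) = 14.119s.")
phases, total = parse_startup_finished(line)
```

Summing the phases (3.242 + 5.699 + 5.178) reproduces the 14.119s total, so the three phases account for the whole boot with no unattributed gap.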
Jan 14 01:16:29.719823 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:16:30.475943 kubelet[1792]: E0114 01:16:30.475849 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:16:30.481602 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:16:30.481873 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:16:30.483028 systemd[1]: kubelet.service: Consumed 1.287s CPU time, 267.6M memory peak. Jan 14 01:16:32.307919 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 14 01:16:32.312213 systemd[1]: Started sshd@0-77.42.79.167:22-68.220.241.50:48174.service - OpenSSH per-connection server daemon (68.220.241.50:48174). Jan 14 01:16:33.033258 sshd[1805]: Accepted publickey for core from 68.220.241.50 port 48174 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:33.034287 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:33.045861 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 14 01:16:33.048305 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 14 01:16:33.060002 systemd-logind[1657]: New session 1 of user core. Jan 14 01:16:33.077013 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 14 01:16:33.083180 systemd[1]: Starting user@500.service - User Manager for UID 500... 
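The kubelet exit above reports a Go-style wrapped error ending in "open /var/lib/kubelet/config.yaml: no such file or directory" (that file is normally written when the node joins a cluster, e.g. via kubeadm — an inference, not stated in the log). A sketch of pulling the missing path out of such an error string:

```python
import re

def missing_file_from_error(err):
    """Pull the path out of a Go-style 'open <path>: no such file' error."""
    m = re.search(r"open (\S+): no such file or directory", err)
    return m.group(1) if m else None

# Tail of the kubelet error recorded in the log above:
err = ('failed to read kubelet config file "/var/lib/kubelet/config.yaml", '
       'error: open /var/lib/kubelet/config.yaml: no such file or directory')
```

Knowing the exact missing path makes the later restart loop legible: the unit will keep exiting with status 1 until that file exists.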
Jan 14 01:16:33.106209 (systemd)[1811]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:33.110591 systemd-logind[1657]: New session 2 of user core. Jan 14 01:16:33.253328 systemd[1811]: Queued start job for default target default.target. Jan 14 01:16:33.265306 systemd[1811]: Created slice app.slice - User Application Slice. Jan 14 01:16:33.265342 systemd[1811]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 14 01:16:33.265359 systemd[1811]: Reached target paths.target - Paths. Jan 14 01:16:33.265410 systemd[1811]: Reached target timers.target - Timers. Jan 14 01:16:33.266916 systemd[1811]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 14 01:16:33.269395 systemd[1811]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 14 01:16:33.276231 systemd[1811]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 14 01:16:33.276803 systemd[1811]: Reached target sockets.target - Sockets. Jan 14 01:16:33.302857 systemd[1811]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 14 01:16:33.302966 systemd[1811]: Reached target basic.target - Basic System. Jan 14 01:16:33.303022 systemd[1811]: Reached target default.target - Main User Target. Jan 14 01:16:33.303052 systemd[1811]: Startup finished in 183ms. Jan 14 01:16:33.303573 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 14 01:16:33.309878 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 14 01:16:33.697761 systemd[1]: Started sshd@1-77.42.79.167:22-68.220.241.50:48190.service - OpenSSH per-connection server daemon (68.220.241.50:48190). 
Jan 14 01:16:34.362265 sshd[1825]: Accepted publickey for core from 68.220.241.50 port 48190 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:34.364926 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:34.372605 systemd-logind[1657]: New session 3 of user core. Jan 14 01:16:34.380513 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 14 01:16:34.733179 sshd[1829]: Connection closed by 68.220.241.50 port 48190 Jan 14 01:16:34.733968 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Jan 14 01:16:34.738006 systemd-logind[1657]: Session 3 logged out. Waiting for processes to exit. Jan 14 01:16:34.738221 systemd[1]: sshd@1-77.42.79.167:22-68.220.241.50:48190.service: Deactivated successfully. Jan 14 01:16:34.739800 systemd[1]: session-3.scope: Deactivated successfully. Jan 14 01:16:34.741388 systemd-logind[1657]: Removed session 3. Jan 14 01:16:34.879012 systemd[1]: Started sshd@2-77.42.79.167:22-68.220.241.50:48194.service - OpenSSH per-connection server daemon (68.220.241.50:48194). Jan 14 01:16:35.562073 sshd[1835]: Accepted publickey for core from 68.220.241.50 port 48194 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:35.565181 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:35.575216 systemd-logind[1657]: New session 4 of user core. Jan 14 01:16:35.581439 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 14 01:16:35.929801 sshd[1839]: Connection closed by 68.220.241.50 port 48194 Jan 14 01:16:35.930774 sshd-session[1835]: pam_unix(sshd:session): session closed for user core Jan 14 01:16:35.937880 systemd[1]: sshd@2-77.42.79.167:22-68.220.241.50:48194.service: Deactivated successfully. Jan 14 01:16:35.942570 systemd[1]: session-4.scope: Deactivated successfully. Jan 14 01:16:35.946840 systemd-logind[1657]: Session 4 logged out. 
Waiting for processes to exit. Jan 14 01:16:35.948402 systemd-logind[1657]: Removed session 4. Jan 14 01:16:36.069572 systemd[1]: Started sshd@3-77.42.79.167:22-68.220.241.50:48208.service - OpenSSH per-connection server daemon (68.220.241.50:48208). Jan 14 01:16:36.730956 sshd[1845]: Accepted publickey for core from 68.220.241.50 port 48208 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:36.734241 sshd-session[1845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:36.742840 systemd-logind[1657]: New session 5 of user core. Jan 14 01:16:36.754596 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 14 01:16:37.102590 sshd[1849]: Connection closed by 68.220.241.50 port 48208 Jan 14 01:16:37.103194 sshd-session[1845]: pam_unix(sshd:session): session closed for user core Jan 14 01:16:37.108536 systemd-logind[1657]: Session 5 logged out. Waiting for processes to exit. Jan 14 01:16:37.109402 systemd[1]: sshd@3-77.42.79.167:22-68.220.241.50:48208.service: Deactivated successfully. Jan 14 01:16:37.111673 systemd[1]: session-5.scope: Deactivated successfully. Jan 14 01:16:37.113091 systemd-logind[1657]: Removed session 5. Jan 14 01:16:37.242513 systemd[1]: Started sshd@4-77.42.79.167:22-68.220.241.50:48210.service - OpenSSH per-connection server daemon (68.220.241.50:48210). Jan 14 01:16:37.927722 sshd[1855]: Accepted publickey for core from 68.220.241.50 port 48210 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:37.929177 sshd-session[1855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:37.937166 systemd-logind[1657]: New session 6 of user core. Jan 14 01:16:37.946389 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jan 14 01:16:38.189925 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 14 01:16:38.190469 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:16:38.205061 sudo[1860]: pam_unix(sudo:session): session closed for user root Jan 14 01:16:38.327077 sshd[1859]: Connection closed by 68.220.241.50 port 48210 Jan 14 01:16:38.327992 sshd-session[1855]: pam_unix(sshd:session): session closed for user core Jan 14 01:16:38.333202 systemd[1]: sshd@4-77.42.79.167:22-68.220.241.50:48210.service: Deactivated successfully. Jan 14 01:16:38.337005 systemd[1]: session-6.scope: Deactivated successfully. Jan 14 01:16:38.340765 systemd-logind[1657]: Session 6 logged out. Waiting for processes to exit. Jan 14 01:16:38.342572 systemd-logind[1657]: Removed session 6. Jan 14 01:16:38.459655 systemd[1]: Started sshd@5-77.42.79.167:22-68.220.241.50:48216.service - OpenSSH per-connection server daemon (68.220.241.50:48216). Jan 14 01:16:39.134647 sshd[1867]: Accepted publickey for core from 68.220.241.50 port 48216 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:39.136602 sshd-session[1867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:39.142521 systemd-logind[1657]: New session 7 of user core. Jan 14 01:16:39.149386 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 14 01:16:39.391033 sudo[1873]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 14 01:16:39.391774 sudo[1873]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:16:39.396064 sudo[1873]: pam_unix(sudo:session): session closed for user root Jan 14 01:16:39.405378 sudo[1872]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 14 01:16:39.405823 sudo[1872]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:16:39.417917 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:16:39.477148 kernel: kauditd_printk_skb: 42 callbacks suppressed Jan 14 01:16:39.477276 kernel: audit: type=1305 audit(1768353399.475:236): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:16:39.475000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:16:39.477397 augenrules[1897]: No rules Jan 14 01:16:39.475000 audit[1897]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdea74bd00 a2=420 a3=0 items=0 ppid=1878 pid=1897 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:39.480334 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:16:39.480930 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 14 01:16:39.486080 kernel: audit: type=1300 audit(1768353399.475:236): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdea74bd00 a2=420 a3=0 items=0 ppid=1878 pid=1897 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:39.486266 kernel: audit: type=1327 audit(1768353399.475:236): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:16:39.475000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:16:39.485032 sudo[1872]: pam_unix(sudo:session): session closed for user root Jan 14 01:16:39.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.491234 kernel: audit: type=1130 audit(1768353399.479:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.491312 kernel: audit: type=1131 audit(1768353399.479:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.479000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.479000 audit[1872]: USER_END pid=1872 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:39.496115 kernel: audit: type=1106 audit(1768353399.479:239): pid=1872 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.485000 audit[1872]: CRED_DISP pid=1872 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.504051 kernel: audit: type=1104 audit(1768353399.485:240): pid=1872 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.610137 sshd[1871]: Connection closed by 68.220.241.50 port 48216 Jan 14 01:16:39.608905 sshd-session[1867]: pam_unix(sshd:session): session closed for user core Jan 14 01:16:39.611000 audit[1867]: USER_END pid=1867 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:39.617920 systemd[1]: sshd@5-77.42.79.167:22-68.220.241.50:48216.service: Deactivated successfully. Jan 14 01:16:39.621940 systemd[1]: session-7.scope: Deactivated successfully. Jan 14 01:16:39.611000 audit[1867]: CRED_DISP pid=1867 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:39.627077 systemd-logind[1657]: Session 7 logged out. Waiting for processes to exit. Jan 14 01:16:39.628906 systemd-logind[1657]: Removed session 7. 
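The audit PROCTITLE records above encode the audited command line as hex bytes with NUL-separated argv elements. A minimal decoder sketch, applied to the auditctl record from the log:

```python
def decode_proctitle(hex_title):
    """Decode an audit PROCTITLE field: hex bytes, NUL-separated argv."""
    return bytes.fromhex(hex_title).replace(b"\x00", b" ").decode()

# The auditctl PROCTITLE recorded in the log above:
title = ("2F7362696E2F617564697463746C002D52002F657463"
         "2F61756469742F61756469742E72756C6573")
```

Decoding `title` yields `/sbin/auditctl -R /etc/audit/audit.rules`, i.e. the rule reload triggered by the `systemctl restart audit-rules` sudo session above.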
Jan 14 01:16:39.635996 kernel: audit: type=1106 audit(1768353399.611:241): pid=1867 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:39.636075 kernel: audit: type=1104 audit(1768353399.611:242): pid=1867 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:39.636143 kernel: audit: type=1131 audit(1768353399.618:243): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.79.167:22-68.220.241.50:48216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.79.167:22-68.220.241.50:48216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.758000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.79.167:22-68.220.241.50:48232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:39.758045 systemd[1]: Started sshd@6-77.42.79.167:22-68.220.241.50:48232.service - OpenSSH per-connection server daemon (68.220.241.50:48232). 
Jan 14 01:16:40.438000 audit[1906]: USER_ACCT pid=1906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:40.439061 sshd[1906]: Accepted publickey for core from 68.220.241.50 port 48232 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:16:40.440000 audit[1906]: CRED_ACQ pid=1906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:40.440000 audit[1906]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd236d90f0 a2=3 a3=0 items=0 ppid=1 pid=1906 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:40.440000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:16:40.442159 sshd-session[1906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:16:40.453197 systemd-logind[1657]: New session 8 of user core. Jan 14 01:16:40.463476 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 14 01:16:40.470000 audit[1906]: USER_START pid=1906 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:40.474000 audit[1910]: CRED_ACQ pid=1910 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:16:40.696000 audit[1911]: USER_ACCT pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:40.696726 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 14 01:16:40.696000 audit[1911]: CRED_REFR pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:40.697674 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:16:40.697000 audit[1911]: USER_START pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:16:40.732671 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 14 01:16:40.740444 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:16:40.944000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:40.944809 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:16:40.953430 (kubelet)[1934]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:16:40.999143 kubelet[1934]: E0114 01:16:40.999003 1934 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:16:41.007000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:16:41.006600 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:16:41.006763 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:16:41.007416 systemd[1]: kubelet.service: Consumed 191ms CPU time, 109.1M memory peak. Jan 14 01:16:41.131881 systemd[1]: Starting docker.service - Docker Application Container Engine... 
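The dockerd and containerd entries in this log use logfmt-style `key="value"` fields. A minimal parser sketch for simple entries (values containing escaped quotes, like the large CRI config blob earlier, would need a proper logfmt parser):

```python
import shlex

def parse_logfmt(line):
    """Split a simple dockerd/containerd logfmt line into a key->value dict."""
    return dict(tok.split("=", 1) for tok in shlex.split(line) if "=" in tok)

# A dockerd entry of the shape recorded in the log below:
entry = parse_logfmt('time="2026-01-14T01:16:41.430706997Z" '
                     'level=info msg="Starting up"')
```

`shlex.split` handles the quoting, so `entry["msg"]` comes back as `Starting up` with the quotes stripped, which makes filtering these entries by `level` or `msg` straightforward.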
Jan 14 01:16:41.154648 (dockerd)[1945]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 14 01:16:41.431300 dockerd[1945]: time="2026-01-14T01:16:41.430706997Z" level=info msg="Starting up" Jan 14 01:16:41.432507 dockerd[1945]: time="2026-01-14T01:16:41.432192107Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 14 01:16:41.446212 dockerd[1945]: time="2026-01-14T01:16:41.446151833Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 14 01:16:41.494423 dockerd[1945]: time="2026-01-14T01:16:41.494373033Z" level=info msg="Loading containers: start." Jan 14 01:16:41.507238 kernel: Initializing XFRM netlink socket Jan 14 01:16:41.573000 audit[1993]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1993 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:41.573000 audit[1993]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd162bcf10 a2=0 a3=0 items=0 ppid=1945 pid=1993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:41.573000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 01:16:41.576000 audit[1995]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1995 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:41.576000 audit[1995]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe375c03b0 a2=0 a3=0 items=0 ppid=1945 pid=1995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:16:41.576000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 14 01:16:41.578000 audit[1997]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1997 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.578000 audit[1997]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd59c27850 a2=0 a3=0 items=0 ppid=1945 pid=1997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.578000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 14 01:16:41.581000 audit[1999]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1999 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.581000 audit[1999]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0ed4eaa0 a2=0 a3=0 items=0 ppid=1945 pid=1999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.581000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 14 01:16:41.583000 audit[2001]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2001 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.583000 audit[2001]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe2f3f8f50 a2=0 a3=0 items=0 ppid=1945 pid=2001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.583000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 14 01:16:41.586000 audit[2003]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2003 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.586000 audit[2003]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe65a82c30 a2=0 a3=0 items=0 ppid=1945 pid=2003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.586000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:16:41.592000 audit[2005]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2005 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.592000 audit[2005]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe4bdccb60 a2=0 a3=0 items=0 ppid=1945 pid=2005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.592000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 14 01:16:41.595000 audit[2007]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2007 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.595000 audit[2007]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7fffa26cd6b0 a2=0 a3=0 items=0 ppid=1945 pid=2007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.595000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 14 01:16:41.630000 audit[2010]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.630000 audit[2010]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7fff03c55f40 a2=0 a3=0 items=0 ppid=1945 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.630000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38
Jan 14 01:16:41.633000 audit[2012]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2012 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.633000 audit[2012]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd07aeea10 a2=0 a3=0 items=0 ppid=1945 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.633000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 14 01:16:41.636000 audit[2014]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2014 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.636000 audit[2014]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd97fa2eb0 a2=0 a3=0 items=0 ppid=1945 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.636000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 14 01:16:41.638000 audit[2016]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.638000 audit[2016]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffff9ed93e0 a2=0 a3=0 items=0 ppid=1945 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.638000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:16:41.641000 audit[2018]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.641000 audit[2018]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffefd0d25e0 a2=0 a3=0 items=0 ppid=1945 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.641000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 14 01:16:41.685000 audit[2048]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.685000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7fff2c905370 a2=0 a3=0 items=0 ppid=1945 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.685000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552
Jan 14 01:16:41.687000 audit[2050]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.687000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd135e9470 a2=0 a3=0 items=0 ppid=1945 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.687000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552
Jan 14 01:16:41.689000 audit[2052]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2052 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.689000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff828f09a0 a2=0 a3=0 items=0 ppid=1945 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.689000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244
Jan 14 01:16:41.692000 audit[2054]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.692000 audit[2054]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd354d1ca0 a2=0 a3=0 items=0 ppid=1945 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.692000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745
Jan 14 01:16:41.694000 audit[2056]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2056 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.694000 audit[2056]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffdff6ad660 a2=0 a3=0 items=0 ppid=1945 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.694000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354
Jan 14 01:16:41.697000 audit[2058]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.697000 audit[2058]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd847e9770 a2=0 a3=0 items=0 ppid=1945 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.697000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:16:41.699000 audit[2060]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.699000 audit[2060]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffca582f6b0 a2=0 a3=0 items=0 ppid=1945 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.699000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 14 01:16:41.702000 audit[2062]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2062 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.702000 audit[2062]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffd9c13a7d0 a2=0 a3=0 items=0 ppid=1945 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.702000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552
Jan 14 01:16:41.705000 audit[2064]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2064 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.705000 audit[2064]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffeae3da4b0 a2=0 a3=0 items=0 ppid=1945 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.705000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238
Jan 14 01:16:41.708000 audit[2066]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.708000 audit[2066]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe02170730 a2=0 a3=0 items=0 ppid=1945 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.708000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244
Jan 14 01:16:41.710000 audit[2068]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2068 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.710000 audit[2068]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd9e042470 a2=0 a3=0 items=0 ppid=1945 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.710000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745
Jan 14 01:16:41.713000 audit[2070]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.713000 audit[2070]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffe22dea1d0 a2=0 a3=0 items=0 ppid=1945 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.713000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31
Jan 14 01:16:41.715000 audit[2072]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.715000 audit[2072]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffe4c13a7a0 a2=0 a3=0 items=0 ppid=1945 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.715000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354
Jan 14 01:16:41.722000 audit[2077]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2077 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.722000 audit[2077]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe43537fd0 a2=0 a3=0 items=0 ppid=1945 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.722000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 14 01:16:41.725000 audit[2079]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2079 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.725000 audit[2079]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc8f058410 a2=0 a3=0 items=0 ppid=1945 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.725000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 14 01:16:41.727000 audit[2081]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2081 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.727000 audit[2081]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcce690090 a2=0 a3=0 items=0 ppid=1945 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.727000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 14 01:16:41.730000 audit[2083]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.730000 audit[2083]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd0ae574a0 a2=0 a3=0 items=0 ppid=1945 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.730000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552
Jan 14 01:16:41.733000 audit[2085]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.733000 audit[2085]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd4b1984c0 a2=0 a3=0 items=0 ppid=1945 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.733000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E
Jan 14 01:16:41.735000 audit[2087]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables"
Jan 14 01:16:41.735000 audit[2087]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd01c78c80 a2=0 a3=0 items=0 ppid=1945 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.735000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552
Jan 14 01:16:41.764974 systemd-timesyncd[1583]: Network configuration changed, trying to establish connection.
Jan 14 01:16:41.782000 audit[2092]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.782000 audit[2092]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffe22be06e0 a2=0 a3=0 items=0 ppid=1945 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.782000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445
Jan 14 01:16:41.788000 audit[2095]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2095 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:41.788000 audit[2095]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd7cc671e0 a2=0 a3=0 items=0 ppid=1945 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:41.788000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E
Jan 14 01:16:43.203872 systemd-resolved[1335]: Clock change detected. Flushing caches.
Jan 14 01:16:43.204280 systemd-timesyncd[1583]: Contacted time server 185.228.138.224:123 (2.flatcar.pool.ntp.org).
Jan 14 01:16:43.204349 systemd-timesyncd[1583]: Initial clock synchronization to Wed 2026-01-14 01:16:43.202053 UTC.
Jan 14 01:16:43.204000 audit[2103]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2103 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.204000 audit[2103]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffdb62368d0 a2=0 a3=0 items=0 ppid=1945 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.204000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054
Jan 14 01:16:43.217000 audit[2109]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.217000 audit[2109]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffc581f4380 a2=0 a3=0 items=0 ppid=1945 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.217000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50
Jan 14 01:16:43.220000 audit[2111]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.220000 audit[2111]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffd35760940 a2=0 a3=0 items=0 ppid=1945 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.220000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054
Jan 14 01:16:43.222000 audit[2113]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.222000 audit[2113]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe453f9320 a2=0 a3=0 items=0 ppid=1945 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.222000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552
Jan 14 01:16:43.225000 audit[2115]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.225000 audit[2115]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffc4fa20100 a2=0 a3=0 items=0 ppid=1945 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.225000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32
Jan 14 01:16:43.227000 audit[2117]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2117 subj=system_u:system_r:kernel_t:s0 comm="iptables"
Jan 14 01:16:43.227000 audit[2117]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd8587cf50 a2=0 a3=0 items=0 ppid=1945 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:16:43.227000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50
Jan 14 01:16:43.229532 systemd-networkd[1572]: docker0: Link UP
Jan 14 01:16:43.235175 dockerd[1945]: time="2026-01-14T01:16:43.235118370Z" level=info msg="Loading containers: done."
Jan 14 01:16:43.254465 dockerd[1945]: time="2026-01-14T01:16:43.254371428Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Jan 14 01:16:43.254645 dockerd[1945]: time="2026-01-14T01:16:43.254520538Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Jan 14 01:16:43.254645 dockerd[1945]: time="2026-01-14T01:16:43.254636008Z" level=info msg="Initializing buildkit"
Jan 14 01:16:43.279708 dockerd[1945]: time="2026-01-14T01:16:43.279491568Z" level=info msg="Completed buildkit initialization"
Jan 14 01:16:43.284612 dockerd[1945]: time="2026-01-14T01:16:43.284577370Z" level=info msg="Daemon has completed initialization"
Jan 14 01:16:43.284841 dockerd[1945]: time="2026-01-14T01:16:43.284781550Z" level=info msg="API listen on /run/docker.sock"
Jan 14 01:16:43.285366 systemd[1]: Started docker.service - Docker Application Container Engine.
Jan 14 01:16:43.284000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:16:44.392390 containerd[1677]: time="2026-01-14T01:16:44.392083012Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\""
Jan 14 01:16:44.973923 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3927713628.mount: Deactivated successfully.
Jan 14 01:16:46.048249 containerd[1677]: time="2026-01-14T01:16:46.048189581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:46.049303 containerd[1677]: time="2026-01-14T01:16:46.049134232Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=28445968"
Jan 14 01:16:46.050148 containerd[1677]: time="2026-01-14T01:16:46.050123752Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:46.052207 containerd[1677]: time="2026-01-14T01:16:46.052170913Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:46.052688 containerd[1677]: time="2026-01-14T01:16:46.052655813Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 1.660523151s"
Jan 14 01:16:46.052734 containerd[1677]: time="2026-01-14T01:16:46.052689923Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\""
Jan 14 01:16:46.053167 containerd[1677]: time="2026-01-14T01:16:46.053145243Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\""
Jan 14 01:16:47.697638 containerd[1677]: time="2026-01-14T01:16:47.697588738Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:47.698706 containerd[1677]: time="2026-01-14T01:16:47.698591269Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26008626"
Jan 14 01:16:47.699583 containerd[1677]: time="2026-01-14T01:16:47.699565549Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:47.701805 containerd[1677]: time="2026-01-14T01:16:47.701781780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:47.702319 containerd[1677]: time="2026-01-14T01:16:47.702303300Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 1.649135257s"
Jan 14 01:16:47.702445 containerd[1677]: time="2026-01-14T01:16:47.702376400Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\""
Jan 14 01:16:47.702912 containerd[1677]: time="2026-01-14T01:16:47.702891380Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\""
Jan 14 01:16:48.947532 containerd[1677]: time="2026-01-14T01:16:48.947060109Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:48.948159 containerd[1677]: time="2026-01-14T01:16:48.948128919Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=20149965"
Jan 14 01:16:48.948942 containerd[1677]: time="2026-01-14T01:16:48.948911219Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:48.951516 containerd[1677]: time="2026-01-14T01:16:48.951476450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 01:16:48.952326 containerd[1677]: time="2026-01-14T01:16:48.951935771Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.248985501s"
Jan 14 01:16:48.952326 containerd[1677]: time="2026-01-14T01:16:48.951958691Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\""
Jan 14 01:16:48.952412 containerd[1677]: time="2026-01-14T01:16:48.952394581Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\""
Jan 14 01:16:50.153451 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3489201535.mount: Deactivated successfully.
Jan 14 01:16:50.467621 containerd[1677]: time="2026-01-14T01:16:50.467449342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:50.468953 containerd[1677]: time="2026-01-14T01:16:50.468756622Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=31926374" Jan 14 01:16:50.469933 containerd[1677]: time="2026-01-14T01:16:50.469905413Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:50.472219 containerd[1677]: time="2026-01-14T01:16:50.472184674Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:50.472874 containerd[1677]: time="2026-01-14T01:16:50.472838074Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 1.520420943s" Jan 14 01:16:50.473064 containerd[1677]: time="2026-01-14T01:16:50.472939154Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\"" Jan 14 01:16:50.473525 containerd[1677]: time="2026-01-14T01:16:50.473483654Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 14 01:16:51.268163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3008340231.mount: Deactivated successfully. 
Jan 14 01:16:52.085143 containerd[1677]: time="2026-01-14T01:16:52.085074006Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:52.086417 containerd[1677]: time="2026-01-14T01:16:52.086179986Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20128467" Jan 14 01:16:52.087333 containerd[1677]: time="2026-01-14T01:16:52.087294336Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:52.089773 containerd[1677]: time="2026-01-14T01:16:52.089734757Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:52.090808 containerd[1677]: time="2026-01-14T01:16:52.090775788Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.617150354s" Jan 14 01:16:52.090808 containerd[1677]: time="2026-01-14T01:16:52.090807318Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jan 14 01:16:52.091382 containerd[1677]: time="2026-01-14T01:16:52.091346448Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 14 01:16:52.575063 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 14 01:16:52.579366 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:16:52.596395 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3646088177.mount: Deactivated successfully. Jan 14 01:16:52.607548 containerd[1677]: time="2026-01-14T01:16:52.607456243Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:16:52.610053 containerd[1677]: time="2026-01-14T01:16:52.609290154Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:16:52.610490 containerd[1677]: time="2026-01-14T01:16:52.610435714Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:16:52.614812 containerd[1677]: time="2026-01-14T01:16:52.614759996Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:16:52.615764 containerd[1677]: time="2026-01-14T01:16:52.615707517Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 524.323459ms" Jan 14 01:16:52.615764 containerd[1677]: time="2026-01-14T01:16:52.615759747Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 14 01:16:52.616490 containerd[1677]: time="2026-01-14T01:16:52.616449737Z" level=info msg="PullImage 
\"registry.k8s.io/etcd:3.5.21-0\"" Jan 14 01:16:52.749254 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:16:52.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:52.751539 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 14 01:16:52.751584 kernel: audit: type=1130 audit(1768353412.749:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:52.774607 (kubelet)[2295]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:16:52.814065 kubelet[2295]: E0114 01:16:52.813985 2295 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:16:52.818098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:16:52.818268 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:16:52.818000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:16:52.819495 systemd[1]: kubelet.service: Consumed 182ms CPU time, 108.3M memory peak. Jan 14 01:16:52.824163 kernel: audit: type=1131 audit(1768353412.818:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=failed' Jan 14 01:16:53.350341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount775046420.mount: Deactivated successfully. Jan 14 01:16:54.900567 containerd[1677]: time="2026-01-14T01:16:54.900487598Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:54.901738 containerd[1677]: time="2026-01-14T01:16:54.901570019Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=56977150" Jan 14 01:16:54.902580 containerd[1677]: time="2026-01-14T01:16:54.902547539Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:54.904961 containerd[1677]: time="2026-01-14T01:16:54.904933320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:16:54.905952 containerd[1677]: time="2026-01-14T01:16:54.905905810Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.289415973s" Jan 14 01:16:54.905952 containerd[1677]: time="2026-01-14T01:16:54.905931790Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jan 14 01:16:58.538330 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:16:58.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:58.538631 systemd[1]: kubelet.service: Consumed 182ms CPU time, 108.3M memory peak. Jan 14 01:16:58.548753 kernel: audit: type=1130 audit(1768353418.537:298): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:58.548910 kernel: audit: type=1131 audit(1768353418.537:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:58.537000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:16:58.546183 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:16:58.572321 systemd[1]: Reload requested from client PID 2385 ('systemctl') (unit session-8.scope)... Jan 14 01:16:58.572338 systemd[1]: Reloading... Jan 14 01:16:58.696040 zram_generator::config[2432]: No configuration found. Jan 14 01:16:58.953719 systemd[1]: Reloading finished in 380 ms. 
Jan 14 01:16:58.987782 kernel: audit: type=1334 audit(1768353418.978:300): prog-id=63 op=LOAD Jan 14 01:16:58.978000 audit: BPF prog-id=63 op=LOAD Jan 14 01:16:58.978000 audit: BPF prog-id=43 op=UNLOAD Jan 14 01:16:58.995257 kernel: audit: type=1334 audit(1768353418.978:301): prog-id=43 op=UNLOAD Jan 14 01:16:58.978000 audit: BPF prog-id=64 op=LOAD Jan 14 01:16:58.978000 audit: BPF prog-id=65 op=LOAD Jan 14 01:16:59.002044 kernel: audit: type=1334 audit(1768353418.978:302): prog-id=64 op=LOAD Jan 14 01:16:59.002123 kernel: audit: type=1334 audit(1768353418.978:303): prog-id=65 op=LOAD Jan 14 01:16:58.978000 audit: BPF prog-id=44 op=UNLOAD Jan 14 01:16:58.978000 audit: BPF prog-id=45 op=UNLOAD Jan 14 01:16:59.015032 kernel: audit: type=1334 audit(1768353418.978:304): prog-id=44 op=UNLOAD Jan 14 01:16:59.015122 kernel: audit: type=1334 audit(1768353418.978:305): prog-id=45 op=UNLOAD Jan 14 01:16:58.980000 audit: BPF prog-id=66 op=LOAD Jan 14 01:16:59.018518 kernel: audit: type=1334 audit(1768353418.980:306): prog-id=66 op=LOAD Jan 14 01:16:58.980000 audit: BPF prog-id=60 op=UNLOAD Jan 14 01:16:58.980000 audit: BPF prog-id=67 op=LOAD Jan 14 01:16:58.980000 audit: BPF prog-id=68 op=LOAD Jan 14 01:16:58.980000 audit: BPF prog-id=61 op=UNLOAD Jan 14 01:16:58.980000 audit: BPF prog-id=62 op=UNLOAD Jan 14 01:16:58.984000 audit: BPF prog-id=69 op=LOAD Jan 14 01:16:58.984000 audit: BPF prog-id=50 op=UNLOAD Jan 14 01:16:58.984000 audit: BPF prog-id=70 op=LOAD Jan 14 01:16:58.984000 audit: BPF prog-id=71 op=LOAD Jan 14 01:16:58.984000 audit: BPF prog-id=51 op=UNLOAD Jan 14 01:16:58.984000 audit: BPF prog-id=52 op=UNLOAD Jan 14 01:16:59.027125 kernel: audit: type=1334 audit(1768353418.980:307): prog-id=60 op=UNLOAD Jan 14 01:16:58.988000 audit: BPF prog-id=72 op=LOAD Jan 14 01:16:58.988000 audit: BPF prog-id=46 op=UNLOAD Jan 14 01:16:58.988000 audit: BPF prog-id=73 op=LOAD Jan 14 01:16:58.988000 audit: BPF prog-id=74 op=LOAD Jan 14 01:16:58.988000 audit: BPF prog-id=47 
op=UNLOAD Jan 14 01:16:58.988000 audit: BPF prog-id=48 op=UNLOAD Jan 14 01:16:58.989000 audit: BPF prog-id=75 op=LOAD Jan 14 01:16:58.989000 audit: BPF prog-id=53 op=UNLOAD Jan 14 01:16:58.989000 audit: BPF prog-id=76 op=LOAD Jan 14 01:16:58.989000 audit: BPF prog-id=77 op=LOAD Jan 14 01:16:58.989000 audit: BPF prog-id=54 op=UNLOAD Jan 14 01:16:58.989000 audit: BPF prog-id=55 op=UNLOAD Jan 14 01:16:58.990000 audit: BPF prog-id=78 op=LOAD Jan 14 01:16:58.999000 audit: BPF prog-id=58 op=UNLOAD Jan 14 01:16:59.001000 audit: BPF prog-id=79 op=LOAD Jan 14 01:16:59.001000 audit: BPF prog-id=80 op=LOAD Jan 14 01:16:59.001000 audit: BPF prog-id=56 op=UNLOAD Jan 14 01:16:59.001000 audit: BPF prog-id=57 op=UNLOAD Jan 14 01:16:59.004000 audit: BPF prog-id=81 op=LOAD Jan 14 01:16:59.004000 audit: BPF prog-id=49 op=UNLOAD Jan 14 01:16:59.005000 audit: BPF prog-id=82 op=LOAD Jan 14 01:16:59.005000 audit: BPF prog-id=59 op=UNLOAD Jan 14 01:16:59.035071 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 14 01:16:59.035281 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 14 01:16:59.036070 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:16:59.036177 systemd[1]: kubelet.service: Consumed 141ms CPU time, 98.5M memory peak. Jan 14 01:16:59.035000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:16:59.039701 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:16:59.246239 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:16:59.245000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:16:59.257422 (kubelet)[2486]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:16:59.292720 kubelet[2486]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:16:59.292720 kubelet[2486]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:16:59.292720 kubelet[2486]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:16:59.293093 kubelet[2486]: I0114 01:16:59.292986 2486 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:16:59.448322 kubelet[2486]: I0114 01:16:59.448278 2486 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 01:16:59.448322 kubelet[2486]: I0114 01:16:59.448305 2486 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:16:59.448613 kubelet[2486]: I0114 01:16:59.448577 2486 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:16:59.470775 kubelet[2486]: E0114 01:16:59.470728 2486 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://77.42.79.167:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 14 01:16:59.470892 kubelet[2486]: I0114 01:16:59.470879 
2486 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:16:59.482777 kubelet[2486]: I0114 01:16:59.482734 2486 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:16:59.486517 kubelet[2486]: I0114 01:16:59.486481 2486 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 14 01:16:59.486819 kubelet[2486]: I0114 01:16:59.486775 2486 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:16:59.486989 kubelet[2486]: I0114 01:16:59.486809 2486 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-2c3a114250","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:16:59.486989 kubelet[2486]: I0114 01:16:59.486987 2486 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 01:16:59.487139 kubelet[2486]: I0114 01:16:59.486997 2486 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 01:16:59.487164 kubelet[2486]: I0114 01:16:59.487158 2486 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:16:59.490237 kubelet[2486]: I0114 01:16:59.490190 2486 kubelet.go:480] "Attempting to sync node with API server" Jan 14 01:16:59.490237 kubelet[2486]: I0114 01:16:59.490217 2486 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:16:59.490237 kubelet[2486]: I0114 01:16:59.490243 2486 kubelet.go:386] "Adding apiserver pod source" Jan 14 01:16:59.490647 kubelet[2486]: I0114 01:16:59.490257 2486 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:16:59.495659 kubelet[2486]: E0114 01:16:59.494934 2486 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.79.167:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-2c3a114250&limit=500&resourceVersion=0\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 01:16:59.495659 kubelet[2486]: E0114 01:16:59.495309 2486 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.79.167:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 14 
01:16:59.496015 kubelet[2486]: I0114 01:16:59.495970 2486 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:16:59.496805 kubelet[2486]: I0114 01:16:59.496482 2486 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:16:59.497709 kubelet[2486]: W0114 01:16:59.497381 2486 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 14 01:16:59.500625 kubelet[2486]: I0114 01:16:59.500598 2486 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:16:59.500686 kubelet[2486]: I0114 01:16:59.500652 2486 server.go:1289] "Started kubelet" Jan 14 01:16:59.503115 kubelet[2486]: I0114 01:16:59.503069 2486 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:16:59.504127 kubelet[2486]: I0114 01:16:59.503989 2486 server.go:317] "Adding debug handlers to kubelet server" Jan 14 01:16:59.505998 kubelet[2486]: I0114 01:16:59.505949 2486 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:16:59.506550 kubelet[2486]: I0114 01:16:59.506526 2486 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:16:59.509028 kubelet[2486]: E0114 01:16:59.506745 2486 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.79.167:6443/api/v1/namespaces/default/events\": dial tcp 77.42.79.167:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4578-0-0-p-2c3a114250.188a7410deae8da8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4578-0-0-p-2c3a114250,UID:ci-4578-0-0-p-2c3a114250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-2c3a114250,},FirstTimestamp:2026-01-14 01:16:59.500621224 +0000 UTC m=+0.239314091,LastTimestamp:2026-01-14 01:16:59.500621224 +0000 UTC m=+0.239314091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-2c3a114250,}" Jan 14 01:16:59.509028 kubelet[2486]: I0114 01:16:59.508893 2486 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:16:59.509167 kubelet[2486]: I0114 01:16:59.509093 2486 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:16:59.513000 audit[2501]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2501 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.515107 kubelet[2486]: E0114 01:16:59.515038 2486 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 01:16:59.513000 audit[2501]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd1bb1b830 a2=0 a3=0 items=0 ppid=2486 pid=2501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.513000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:16:59.515332 kubelet[2486]: E0114 01:16:59.515279 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:16:59.515332 kubelet[2486]: I0114 01:16:59.515330 2486 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:16:59.515550 kubelet[2486]: I0114 01:16:59.515517 2486 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:16:59.515601 kubelet[2486]: I0114 01:16:59.515586 2486 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:16:59.516289 kubelet[2486]: I0114 01:16:59.516271 2486 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:16:59.516357 kubelet[2486]: I0114 01:16:59.516338 2486 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:16:59.516625 kubelet[2486]: E0114 01:16:59.516601 2486 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.79.167:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 14 01:16:59.515000 audit[2502]: 
NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2502 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.515000 audit[2502]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe468dabf0 a2=0 a3=0 items=0 ppid=2486 pid=2502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.515000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:16:59.517639 kubelet[2486]: E0114 01:16:59.517593 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.79.167:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-2c3a114250?timeout=10s\": dial tcp 77.42.79.167:6443: connect: connection refused" interval="200ms" Jan 14 01:16:59.517869 kubelet[2486]: I0114 01:16:59.517853 2486 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:16:59.518000 audit[2504]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2504 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.518000 audit[2504]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe878a4a30 a2=0 a3=0 items=0 ppid=2486 pid=2504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.518000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:16:59.521000 audit[2506]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2506 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.521000 audit[2506]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffdcff65ef0 a2=0 a3=0 items=0 ppid=2486 pid=2506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.521000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:16:59.529000 audit[2509]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.529000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffeb76a3b50 a2=0 a3=0 items=0 ppid=2486 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.529000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 14 01:16:59.532252 kubelet[2486]: I0114 01:16:59.532224 2486 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Jan 14 01:16:59.532000 audit[2511]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2511 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:16:59.532000 audit[2511]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc5de70660 a2=0 a3=0 items=0 ppid=2486 pid=2511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.532000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:16:59.533807 kubelet[2486]: I0114 01:16:59.533779 2486 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 14 01:16:59.533849 kubelet[2486]: I0114 01:16:59.533825 2486 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 01:16:59.533867 kubelet[2486]: I0114 01:16:59.533848 2486 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 14 01:16:59.533867 kubelet[2486]: I0114 01:16:59.533856 2486 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 01:16:59.533935 kubelet[2486]: E0114 01:16:59.533908 2486 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:16:59.533000 audit[2512]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.533000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe8a6ec590 a2=0 a3=0 items=0 ppid=2486 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.533000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:16:59.536000 audit[2514]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2514 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:16:59.536000 audit[2514]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc210d30c0 a2=0 a3=0 items=0 ppid=2486 pid=2514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.536000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:16:59.537000 audit[2515]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2515 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:16:59.537000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeaed5e390 a2=0 a3=0 items=0 ppid=2486 
pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.537000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:16:59.539000 audit[2516]: NETFILTER_CFG table=filter:51 family=10 entries=1 op=nft_register_chain pid=2516 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:16:59.539000 audit[2516]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcea7e21c0 a2=0 a3=0 items=0 ppid=2486 pid=2516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.539000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:16:59.542000 audit[2517]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2517 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.542000 audit[2517]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0d757220 a2=0 a3=0 items=0 ppid=2486 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.542000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:16:59.544000 audit[2518]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2518 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:16:59.544000 audit[2518]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffedf9065c0 a2=0 a3=0 items=0 
ppid=2486 pid=2518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:16:59.544000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:16:59.546321 kubelet[2486]: E0114 01:16:59.546286 2486 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.79.167:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 01:16:59.547532 kubelet[2486]: I0114 01:16:59.547512 2486 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:16:59.547532 kubelet[2486]: I0114 01:16:59.547528 2486 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:16:59.547617 kubelet[2486]: I0114 01:16:59.547555 2486 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:16:59.577300 kubelet[2486]: I0114 01:16:59.577251 2486 policy_none.go:49] "None policy: Start" Jan 14 01:16:59.577300 kubelet[2486]: I0114 01:16:59.577290 2486 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:16:59.577300 kubelet[2486]: I0114 01:16:59.577311 2486 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:16:59.611118 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 14 01:16:59.616317 kubelet[2486]: E0114 01:16:59.616276 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:16:59.624336 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Jan 14 01:16:59.629947 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 14 01:16:59.634595 kubelet[2486]: E0114 01:16:59.634533 2486 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 14 01:16:59.643071 kubelet[2486]: E0114 01:16:59.642696 2486 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:16:59.643071 kubelet[2486]: I0114 01:16:59.642932 2486 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:16:59.643071 kubelet[2486]: I0114 01:16:59.642946 2486 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:16:59.643626 kubelet[2486]: I0114 01:16:59.643611 2486 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:16:59.645491 kubelet[2486]: E0114 01:16:59.645459 2486 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 14 01:16:59.645563 kubelet[2486]: E0114 01:16:59.645511 2486 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:16:59.718491 kubelet[2486]: E0114 01:16:59.718392 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.79.167:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-2c3a114250?timeout=10s\": dial tcp 77.42.79.167:6443: connect: connection refused" interval="400ms" Jan 14 01:16:59.745749 kubelet[2486]: I0114 01:16:59.745663 2486 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.746121 kubelet[2486]: E0114 01:16:59.745982 2486 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.79.167:6443/api/v1/nodes\": dial tcp 77.42.79.167:6443: connect: connection refused" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.853040 systemd[1]: Created slice kubepods-burstable-podf7ad8f0024e49c19b755cfb9d0ad158e.slice - libcontainer container kubepods-burstable-podf7ad8f0024e49c19b755cfb9d0ad158e.slice. Jan 14 01:16:59.872961 kubelet[2486]: E0114 01:16:59.872907 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.877152 systemd[1]: Created slice kubepods-burstable-pod6c8218ba496510a55706811a8336a2c6.slice - libcontainer container kubepods-burstable-pod6c8218ba496510a55706811a8336a2c6.slice. 
Jan 14 01:16:59.889355 kubelet[2486]: E0114 01:16:59.889128 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.893448 systemd[1]: Created slice kubepods-burstable-podd754af2259c2de388dd3ff2a3cc6c6ba.slice - libcontainer container kubepods-burstable-podd754af2259c2de388dd3ff2a3cc6c6ba.slice. Jan 14 01:16:59.897094 kubelet[2486]: E0114 01:16:59.897046 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917581 kubelet[2486]: I0114 01:16:59.917508 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917581 kubelet[2486]: I0114 01:16:59.917566 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917581 kubelet[2486]: I0114 01:16:59.917588 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917809 
kubelet[2486]: I0114 01:16:59.917606 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917809 kubelet[2486]: I0114 01:16:59.917625 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917809 kubelet[2486]: I0114 01:16:59.917642 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917809 kubelet[2486]: I0114 01:16:59.917658 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917809 kubelet[2486]: I0114 01:16:59.917685 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d754af2259c2de388dd3ff2a3cc6c6ba-kubeconfig\") pod 
\"kube-scheduler-ci-4578-0-0-p-2c3a114250\" (UID: \"d754af2259c2de388dd3ff2a3cc6c6ba\") " pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.917909 kubelet[2486]: I0114 01:16:59.917704 2486 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.949099 kubelet[2486]: I0114 01:16:59.949039 2486 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:16:59.949782 kubelet[2486]: E0114 01:16:59.949703 2486 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.79.167:6443/api/v1/nodes\": dial tcp 77.42.79.167:6443: connect: connection refused" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:00.120285 kubelet[2486]: E0114 01:17:00.120137 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.79.167:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-2c3a114250?timeout=10s\": dial tcp 77.42.79.167:6443: connect: connection refused" interval="800ms" Jan 14 01:17:00.175833 containerd[1677]: time="2026-01-14T01:17:00.175739095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-2c3a114250,Uid:f7ad8f0024e49c19b755cfb9d0ad158e,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:00.193108 containerd[1677]: time="2026-01-14T01:17:00.193065462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-2c3a114250,Uid:6c8218ba496510a55706811a8336a2c6,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:00.206928 containerd[1677]: time="2026-01-14T01:17:00.206737958Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-2c3a114250,Uid:d754af2259c2de388dd3ff2a3cc6c6ba,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:00.208431 containerd[1677]: time="2026-01-14T01:17:00.208411399Z" level=info msg="connecting to shim c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101" address="unix:///run/containerd/s/3bfef5288f2e495580a350037f4acd825494886a4d3b2d3e1105c8e5b56ccb3b" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:00.232971 containerd[1677]: time="2026-01-14T01:17:00.232531439Z" level=info msg="connecting to shim f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e" address="unix:///run/containerd/s/51f75a6a1db2f800d9509c36aa56baeb93cd8bb613eff19ecd20ca0273b33253" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:00.251295 systemd[1]: Started cri-containerd-c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101.scope - libcontainer container c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101. Jan 14 01:17:00.258660 containerd[1677]: time="2026-01-14T01:17:00.258628820Z" level=info msg="connecting to shim f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30" address="unix:///run/containerd/s/b81c031bfb5f4f100c3fd2621c451b66b36fdba020714d83593f519832a988f3" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:00.274882 systemd[1]: Started cri-containerd-f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e.scope - libcontainer container f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e. 
Jan 14 01:17:00.276000 audit: BPF prog-id=83 op=LOAD Jan 14 01:17:00.276000 audit: BPF prog-id=84 op=LOAD Jan 14 01:17:00.276000 audit[2541]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.276000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.277000 audit: BPF prog-id=84 op=UNLOAD Jan 14 01:17:00.277000 audit[2541]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.277000 audit: BPF prog-id=85 op=LOAD Jan 14 01:17:00.277000 audit[2541]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.277000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.277000 audit: BPF prog-id=86 op=LOAD Jan 14 01:17:00.277000 audit[2541]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.277000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.278000 audit: BPF prog-id=86 op=UNLOAD Jan 14 01:17:00.278000 audit[2541]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.278000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.278000 audit: BPF prog-id=85 op=UNLOAD Jan 14 01:17:00.278000 audit[2541]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:00.278000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.278000 audit: BPF prog-id=87 op=LOAD Jan 14 01:17:00.278000 audit[2541]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2529 pid=2541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.278000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353338666434336562303966363738663961653232326637656234 Jan 14 01:17:00.294769 systemd[1]: Started cri-containerd-f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30.scope - libcontainer container f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30. 
Jan 14 01:17:00.299000 audit: BPF prog-id=88 op=LOAD Jan 14 01:17:00.300000 audit: BPF prog-id=89 op=LOAD Jan 14 01:17:00.300000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.300000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.300000 audit: BPF prog-id=89 op=UNLOAD Jan 14 01:17:00.300000 audit[2581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.300000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.300000 audit: BPF prog-id=90 op=LOAD Jan 14 01:17:00.300000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.300000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.301000 audit: BPF prog-id=91 op=LOAD Jan 14 01:17:00.301000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.301000 audit: BPF prog-id=91 op=UNLOAD Jan 14 01:17:00.301000 audit[2581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.301000 audit: BPF prog-id=90 op=UNLOAD Jan 14 01:17:00.301000 audit[2581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:00.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.301000 audit: BPF prog-id=92 op=LOAD Jan 14 01:17:00.301000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2554 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635306433313264623761646337363466653161353731383439353531 Jan 14 01:17:00.317870 containerd[1677]: time="2026-01-14T01:17:00.317829534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-2c3a114250,Uid:f7ad8f0024e49c19b755cfb9d0ad158e,Namespace:kube-system,Attempt:0,} returns sandbox id \"c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101\"" Jan 14 01:17:00.319000 audit: BPF prog-id=93 op=LOAD Jan 14 01:17:00.320000 audit: BPF prog-id=94 op=LOAD Jan 14 01:17:00.320000 audit[2612]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.320000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.320000 audit: BPF prog-id=94 op=UNLOAD Jan 14 01:17:00.320000 audit[2612]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.320000 audit: BPF prog-id=95 op=LOAD Jan 14 01:17:00.320000 audit[2612]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.320000 audit: BPF prog-id=96 op=LOAD Jan 14 01:17:00.320000 audit[2612]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 14 01:17:00.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.320000 audit: BPF prog-id=96 op=UNLOAD Jan 14 01:17:00.320000 audit[2612]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.321000 audit: BPF prog-id=95 op=UNLOAD Jan 14 01:17:00.321000 audit[2612]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.321000 audit: BPF prog-id=97 op=LOAD Jan 14 01:17:00.321000 audit[2612]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2586 pid=2612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635346339303164353831356663366535636534396666326234353162 Jan 14 01:17:00.324013 containerd[1677]: time="2026-01-14T01:17:00.323972217Z" level=info msg="CreateContainer within sandbox \"c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 14 01:17:00.329456 kubelet[2486]: E0114 01:17:00.329418 2486 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.79.167:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-2c3a114250&limit=500&resourceVersion=0\": dial tcp 77.42.79.167:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 01:17:00.333231 containerd[1677]: time="2026-01-14T01:17:00.333192981Z" level=info msg="Container 1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:00.338781 containerd[1677]: time="2026-01-14T01:17:00.338754563Z" level=info msg="CreateContainer within sandbox \"c9538fd43eb09f678f9ae222f7eb49ec5b4115ba83845303ab7dd10a3db79101\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448\"" Jan 14 01:17:00.340668 containerd[1677]: time="2026-01-14T01:17:00.340647234Z" level=info msg="StartContainer for \"1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448\"" Jan 14 01:17:00.341443 containerd[1677]: time="2026-01-14T01:17:00.341422504Z" level=info msg="connecting to shim 1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448" 
address="unix:///run/containerd/s/3bfef5288f2e495580a350037f4acd825494886a4d3b2d3e1105c8e5b56ccb3b" protocol=ttrpc version=3 Jan 14 01:17:00.352926 kubelet[2486]: I0114 01:17:00.352328 2486 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:00.353775 kubelet[2486]: E0114 01:17:00.353743 2486 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.79.167:6443/api/v1/nodes\": dial tcp 77.42.79.167:6443: connect: connection refused" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:00.369328 systemd[1]: Started cri-containerd-1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448.scope - libcontainer container 1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448. Jan 14 01:17:00.376325 containerd[1677]: time="2026-01-14T01:17:00.376208349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-2c3a114250,Uid:d754af2259c2de388dd3ff2a3cc6c6ba,Namespace:kube-system,Attempt:0,} returns sandbox id \"f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30\"" Jan 14 01:17:00.381775 containerd[1677]: time="2026-01-14T01:17:00.381736391Z" level=info msg="CreateContainer within sandbox \"f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 14 01:17:00.383124 containerd[1677]: time="2026-01-14T01:17:00.383095202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-2c3a114250,Uid:6c8218ba496510a55706811a8336a2c6,Namespace:kube-system,Attempt:0,} returns sandbox id \"f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e\"" Jan 14 01:17:00.391000 audit: BPF prog-id=98 op=LOAD Jan 14 01:17:00.391000 audit: BPF prog-id=99 op=LOAD Jan 14 01:17:00.391000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.391000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.392000 audit: BPF prog-id=99 op=UNLOAD Jan 14 01:17:00.392000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.392000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.392000 audit: BPF prog-id=100 op=LOAD Jan 14 01:17:00.392000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.392000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.392000 audit: BPF prog-id=101 op=LOAD Jan 14 01:17:00.392000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 
ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.392000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.393000 audit: BPF prog-id=101 op=UNLOAD Jan 14 01:17:00.393000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.393000 audit: BPF prog-id=100 op=UNLOAD Jan 14 01:17:00.393000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.393000 audit: BPF prog-id=102 op=LOAD Jan 14 01:17:00.393000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 
a1=c0001286e8 a2=98 a3=0 items=0 ppid=2529 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164336163663236363463396464346534323631656439333632613661 Jan 14 01:17:00.396516 containerd[1677]: time="2026-01-14T01:17:00.396460867Z" level=info msg="Container 796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:00.399432 containerd[1677]: time="2026-01-14T01:17:00.399398248Z" level=info msg="CreateContainer within sandbox \"f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 14 01:17:00.404210 containerd[1677]: time="2026-01-14T01:17:00.404160050Z" level=info msg="CreateContainer within sandbox \"f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42\"" Jan 14 01:17:00.405928 containerd[1677]: time="2026-01-14T01:17:00.405873721Z" level=info msg="StartContainer for \"796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42\"" Jan 14 01:17:00.408071 containerd[1677]: time="2026-01-14T01:17:00.408033772Z" level=info msg="Container 3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:00.410031 containerd[1677]: time="2026-01-14T01:17:00.408858722Z" level=info msg="connecting to shim 796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42" 
address="unix:///run/containerd/s/b81c031bfb5f4f100c3fd2621c451b66b36fdba020714d83593f519832a988f3" protocol=ttrpc version=3 Jan 14 01:17:00.417558 containerd[1677]: time="2026-01-14T01:17:00.417484926Z" level=info msg="CreateContainer within sandbox \"f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200\"" Jan 14 01:17:00.420137 containerd[1677]: time="2026-01-14T01:17:00.420091997Z" level=info msg="StartContainer for \"3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200\"" Jan 14 01:17:00.426242 containerd[1677]: time="2026-01-14T01:17:00.425096239Z" level=info msg="connecting to shim 3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200" address="unix:///run/containerd/s/51f75a6a1db2f800d9509c36aa56baeb93cd8bb613eff19ecd20ca0273b33253" protocol=ttrpc version=3 Jan 14 01:17:00.450355 systemd[1]: Started cri-containerd-796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42.scope - libcontainer container 796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42. Jan 14 01:17:00.461253 systemd[1]: Started cri-containerd-3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200.scope - libcontainer container 3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200. 
Jan 14 01:17:00.470524 containerd[1677]: time="2026-01-14T01:17:00.470439228Z" level=info msg="StartContainer for \"1d3acf2664c9dd4e4261ed9362a6a998c475389aac8fe96d347123014883f448\" returns successfully" Jan 14 01:17:00.484000 audit: BPF prog-id=103 op=LOAD Jan 14 01:17:00.485000 audit: BPF prog-id=104 op=LOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=104 op=UNLOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=105 op=LOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=106 op=LOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=106 op=UNLOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=105 op=UNLOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.485000 audit: BPF prog-id=107 op=LOAD Jan 14 01:17:00.485000 audit[2688]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2554 pid=2688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361303162373361636666333035616639333731613464356563376564 Jan 14 01:17:00.496000 audit: BPF prog-id=108 op=LOAD Jan 14 01:17:00.496000 audit: BPF prog-id=109 op=LOAD Jan 14 01:17:00.496000 audit[2682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174238 a2=98 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.496000 audit: BPF prog-id=109 op=UNLOAD Jan 14 01:17:00.496000 audit[2682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.496000 audit: BPF prog-id=110 op=LOAD Jan 14 01:17:00.496000 audit[2682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174488 a2=98 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.496000 audit: BPF prog-id=111 op=LOAD Jan 14 01:17:00.496000 audit[2682]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000174218 a2=98 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.497000 audit: BPF prog-id=111 op=UNLOAD Jan 14 01:17:00.497000 audit[2682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 
ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.497000 audit: BPF prog-id=110 op=UNLOAD Jan 14 01:17:00.497000 audit[2682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.497000 audit: BPF prog-id=112 op=LOAD Jan 14 01:17:00.497000 audit[2682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001746e8 a2=98 a3=0 items=0 ppid=2586 pid=2682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:00.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739363536336139353937613134343563356566333131396538623164 Jan 14 01:17:00.536040 containerd[1677]: time="2026-01-14T01:17:00.535471215Z" level=info msg="StartContainer for 
\"3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200\" returns successfully" Jan 14 01:17:00.556837 kubelet[2486]: E0114 01:17:00.556804 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:00.558559 kubelet[2486]: E0114 01:17:00.558530 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:00.602965 containerd[1677]: time="2026-01-14T01:17:00.602929053Z" level=info msg="StartContainer for \"796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42\" returns successfully" Jan 14 01:17:01.157197 kubelet[2486]: I0114 01:17:01.157142 2486 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:01.561704 kubelet[2486]: E0114 01:17:01.561675 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:01.563201 kubelet[2486]: E0114 01:17:01.563184 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:01.972176 kubelet[2486]: E0114 01:17:01.972132 2486 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:02.106351 kubelet[2486]: I0114 01:17:02.106303 2486 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:02.106477 kubelet[2486]: E0114 01:17:02.106365 2486 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node 
\"ci-4578-0-0-p-2c3a114250\": node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.140229 kubelet[2486]: E0114 01:17:02.140197 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.240808 kubelet[2486]: E0114 01:17:02.240703 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.341453 kubelet[2486]: E0114 01:17:02.341396 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.442307 kubelet[2486]: E0114 01:17:02.442219 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.543135 kubelet[2486]: E0114 01:17:02.542984 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.564390 kubelet[2486]: E0114 01:17:02.564339 2486 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-2c3a114250\" not found" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:02.643520 kubelet[2486]: E0114 01:17:02.643448 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.744213 kubelet[2486]: E0114 01:17:02.744122 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.845324 kubelet[2486]: E0114 01:17:02.845215 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:02.946308 kubelet[2486]: E0114 01:17:02.946228 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:03.047373 kubelet[2486]: E0114 01:17:03.047312 2486 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:03.067538 kubelet[2486]: I0114 01:17:03.067486 2486 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.118827 kubelet[2486]: I0114 01:17:03.118202 2486 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.124307 kubelet[2486]: E0114 01:17:03.124234 2486 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.124307 kubelet[2486]: I0114 01:17:03.124265 2486 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.129493 kubelet[2486]: I0114 01:17:03.129431 2486 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.497402 kubelet[2486]: I0114 01:17:03.497352 2486 apiserver.go:52] "Watching apiserver" Jan 14 01:17:03.516485 kubelet[2486]: I0114 01:17:03.516426 2486 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:17:03.565480 kubelet[2486]: I0114 01:17:03.565423 2486 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:03.572696 kubelet[2486]: E0114 01:17:03.572512 2486 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:04.218279 systemd[1]: Reload requested from client PID 2766 ('systemctl') (unit session-8.scope)... 
Jan 14 01:17:04.218313 systemd[1]: Reloading... Jan 14 01:17:04.382050 zram_generator::config[2813]: No configuration found. Jan 14 01:17:04.571745 systemd[1]: Reloading finished in 352 ms. Jan 14 01:17:04.601224 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:17:04.619313 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 01:17:04.633497 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 14 01:17:04.633651 kernel: audit: type=1131 audit(1768353424.618:402): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:17:04.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:17:04.619646 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:17:04.619714 systemd[1]: kubelet.service: Consumed 667ms CPU time, 130.3M memory peak. Jan 14 01:17:04.624000 audit: BPF prog-id=113 op=LOAD Jan 14 01:17:04.623176 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:17:04.637134 kernel: audit: type=1334 audit(1768353424.624:403): prog-id=113 op=LOAD Jan 14 01:17:04.643116 kernel: audit: type=1334 audit(1768353424.624:404): prog-id=82 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=82 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=114 op=LOAD Jan 14 01:17:04.650055 kernel: audit: type=1334 audit(1768353424.624:405): prog-id=114 op=LOAD Jan 14 01:17:04.624000 audit: BPF prog-id=72 op=UNLOAD Jan 14 01:17:04.655042 kernel: audit: type=1334 audit(1768353424.624:406): prog-id=72 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=115 op=LOAD Jan 14 01:17:04.663958 kernel: audit: type=1334 audit(1768353424.624:407): prog-id=115 op=LOAD Jan 14 01:17:04.664207 kernel: audit: type=1334 audit(1768353424.624:408): prog-id=116 op=LOAD Jan 14 01:17:04.624000 audit: BPF prog-id=116 op=LOAD Jan 14 01:17:04.624000 audit: BPF prog-id=73 op=UNLOAD Jan 14 01:17:04.670038 kernel: audit: type=1334 audit(1768353424.624:409): prog-id=73 op=UNLOAD Jan 14 01:17:04.670080 kernel: audit: type=1334 audit(1768353424.624:410): prog-id=74 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=74 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=117 op=LOAD Jan 14 01:17:04.675055 kernel: audit: type=1334 audit(1768353424.624:411): prog-id=117 op=LOAD Jan 14 01:17:04.624000 audit: BPF prog-id=118 op=LOAD Jan 14 01:17:04.624000 audit: BPF prog-id=79 op=UNLOAD Jan 14 01:17:04.624000 audit: BPF prog-id=80 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=119 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=66 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=120 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=121 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=67 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=68 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=122 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=81 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=123 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=63 
op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=124 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=125 op=LOAD Jan 14 01:17:04.629000 audit: BPF prog-id=64 op=UNLOAD Jan 14 01:17:04.629000 audit: BPF prog-id=65 op=UNLOAD Jan 14 01:17:04.637000 audit: BPF prog-id=126 op=LOAD Jan 14 01:17:04.637000 audit: BPF prog-id=78 op=UNLOAD Jan 14 01:17:04.638000 audit: BPF prog-id=127 op=LOAD Jan 14 01:17:04.638000 audit: BPF prog-id=75 op=UNLOAD Jan 14 01:17:04.638000 audit: BPF prog-id=128 op=LOAD Jan 14 01:17:04.638000 audit: BPF prog-id=129 op=LOAD Jan 14 01:17:04.638000 audit: BPF prog-id=76 op=UNLOAD Jan 14 01:17:04.638000 audit: BPF prog-id=77 op=UNLOAD Jan 14 01:17:04.642000 audit: BPF prog-id=130 op=LOAD Jan 14 01:17:04.642000 audit: BPF prog-id=69 op=UNLOAD Jan 14 01:17:04.642000 audit: BPF prog-id=131 op=LOAD Jan 14 01:17:04.642000 audit: BPF prog-id=132 op=LOAD Jan 14 01:17:04.642000 audit: BPF prog-id=70 op=UNLOAD Jan 14 01:17:04.642000 audit: BPF prog-id=71 op=UNLOAD Jan 14 01:17:04.840000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:17:04.841587 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:17:04.857684 (kubelet)[2863]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:17:04.915655 kubelet[2863]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:17:04.915655 kubelet[2863]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Jan 14 01:17:04.915655 kubelet[2863]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:17:04.916177 kubelet[2863]: I0114 01:17:04.915672 2863 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:17:04.925069 kubelet[2863]: I0114 01:17:04.924876 2863 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 01:17:04.925069 kubelet[2863]: I0114 01:17:04.924904 2863 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:17:04.925525 kubelet[2863]: I0114 01:17:04.925511 2863 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:17:04.927409 kubelet[2863]: I0114 01:17:04.927392 2863 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 14 01:17:04.929676 kubelet[2863]: I0114 01:17:04.929657 2863 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:17:04.935191 kubelet[2863]: I0114 01:17:04.935111 2863 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:17:04.941033 kubelet[2863]: I0114 01:17:04.940724 2863 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 01:17:04.941033 kubelet[2863]: I0114 01:17:04.940951 2863 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:17:04.941260 kubelet[2863]: I0114 01:17:04.940971 2863 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-2c3a114250","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:17:04.941373 kubelet[2863]: I0114 01:17:04.941364 2863 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 
01:17:04.941427 kubelet[2863]: I0114 01:17:04.941420 2863 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 01:17:04.941508 kubelet[2863]: I0114 01:17:04.941501 2863 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:17:04.941749 kubelet[2863]: I0114 01:17:04.941739 2863 kubelet.go:480] "Attempting to sync node with API server" Jan 14 01:17:04.941803 kubelet[2863]: I0114 01:17:04.941795 2863 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:17:04.941928 kubelet[2863]: I0114 01:17:04.941921 2863 kubelet.go:386] "Adding apiserver pod source" Jan 14 01:17:04.941983 kubelet[2863]: I0114 01:17:04.941975 2863 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:17:04.946516 kubelet[2863]: I0114 01:17:04.946484 2863 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:17:04.947195 kubelet[2863]: I0114 01:17:04.947177 2863 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:17:04.956713 kubelet[2863]: I0114 01:17:04.956428 2863 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:17:04.956713 kubelet[2863]: I0114 01:17:04.956478 2863 server.go:1289] "Started kubelet" Jan 14 01:17:04.961020 kubelet[2863]: I0114 01:17:04.960263 2863 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:17:04.961810 kubelet[2863]: I0114 01:17:04.961784 2863 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:17:04.963212 kubelet[2863]: I0114 01:17:04.962891 2863 server.go:317] "Adding debug handlers to kubelet server" Jan 14 01:17:04.968718 kubelet[2863]: I0114 01:17:04.968671 2863 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:17:04.968983 kubelet[2863]: I0114 01:17:04.968971 2863 
server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:17:04.969077 kubelet[2863]: I0114 01:17:04.968759 2863 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:17:04.972811 kubelet[2863]: I0114 01:17:04.972782 2863 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:17:04.973052 kubelet[2863]: E0114 01:17:04.973031 2863 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-2c3a114250\" not found" Jan 14 01:17:04.974129 kubelet[2863]: I0114 01:17:04.974104 2863 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:17:04.974308 kubelet[2863]: I0114 01:17:04.974290 2863 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:17:04.980280 kubelet[2863]: I0114 01:17:04.980247 2863 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:17:04.980401 kubelet[2863]: I0114 01:17:04.980377 2863 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:17:04.983208 kubelet[2863]: I0114 01:17:04.983185 2863 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:17:04.987024 kubelet[2863]: I0114 01:17:04.986363 2863 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 14 01:17:04.989399 kubelet[2863]: I0114 01:17:04.989341 2863 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 14 01:17:04.989399 kubelet[2863]: I0114 01:17:04.989398 2863 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 01:17:04.989487 kubelet[2863]: I0114 01:17:04.989423 2863 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:17:04.989487 kubelet[2863]: I0114 01:17:04.989436 2863 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 01:17:04.989546 kubelet[2863]: E0114 01:17:04.989498 2863 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:17:04.998607 kubelet[2863]: E0114 01:17:04.998577 2863 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 01:17:05.035533 kubelet[2863]: I0114 01:17:05.035498 2863 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:17:05.035689 kubelet[2863]: I0114 01:17:05.035565 2863 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:17:05.035689 kubelet[2863]: I0114 01:17:05.035582 2863 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:17:05.035689 kubelet[2863]: I0114 01:17:05.035685 2863 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 14 01:17:05.035754 kubelet[2863]: I0114 01:17:05.035691 2863 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 14 01:17:05.035754 kubelet[2863]: I0114 01:17:05.035703 2863 policy_none.go:49] "None policy: Start" Jan 14 01:17:05.035754 kubelet[2863]: I0114 01:17:05.035711 2863 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:17:05.035754 kubelet[2863]: I0114 01:17:05.035719 2863 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:17:05.035841 kubelet[2863]: I0114 01:17:05.035783 2863 state_mem.go:75] "Updated machine memory state" Jan 14 01:17:05.039490 
kubelet[2863]: E0114 01:17:05.039471 2863 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:17:05.039996 kubelet[2863]: I0114 01:17:05.039979 2863 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:17:05.040071 kubelet[2863]: I0114 01:17:05.039995 2863 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:17:05.041024 kubelet[2863]: I0114 01:17:05.040214 2863 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:17:05.041515 kubelet[2863]: E0114 01:17:05.041495 2863 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 01:17:05.090636 kubelet[2863]: I0114 01:17:05.090586 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.090997 kubelet[2863]: I0114 01:17:05.090940 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.091248 kubelet[2863]: I0114 01:17:05.091212 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.100268 kubelet[2863]: E0114 01:17:05.100112 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.100268 kubelet[2863]: E0114 01:17:05.100207 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.102445 kubelet[2863]: E0114 01:17:05.102401 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.143709 kubelet[2863]: I0114 01:17:05.143519 2863 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.150898 kubelet[2863]: I0114 01:17:05.150864 2863 kubelet_node_status.go:124] "Node was previously registered" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.151234 kubelet[2863]: I0114 01:17:05.151212 2863 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275494 kubelet[2863]: I0114 01:17:05.275354 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275494 kubelet[2863]: I0114 01:17:05.275427 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d754af2259c2de388dd3ff2a3cc6c6ba-kubeconfig\") pod \"kube-scheduler-ci-4578-0-0-p-2c3a114250\" (UID: \"d754af2259c2de388dd3ff2a3cc6c6ba\") " pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275494 kubelet[2863]: I0114 01:17:05.275467 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275847 kubelet[2863]: I0114 01:17:05.275502 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275847 kubelet[2863]: I0114 01:17:05.275540 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275847 kubelet[2863]: I0114 01:17:05.275599 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275847 kubelet[2863]: I0114 01:17:05.275633 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.275847 kubelet[2863]: I0114 01:17:05.275670 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6c8218ba496510a55706811a8336a2c6-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" (UID: \"6c8218ba496510a55706811a8336a2c6\") " 
pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.276161 kubelet[2863]: I0114 01:17:05.275704 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f7ad8f0024e49c19b755cfb9d0ad158e-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" (UID: \"f7ad8f0024e49c19b755cfb9d0ad158e\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:05.945078 kubelet[2863]: I0114 01:17:05.944363 2863 apiserver.go:52] "Watching apiserver" Jan 14 01:17:05.974825 kubelet[2863]: I0114 01:17:05.974742 2863 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:17:05.991699 kubelet[2863]: I0114 01:17:05.991638 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4578-0-0-p-2c3a114250" podStartSLOduration=2.9916247670000002 podStartE2EDuration="2.991624767s" podCreationTimestamp="2026-01-14 01:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:05.991505957 +0000 UTC m=+1.126425210" watchObservedRunningTime="2026-01-14 01:17:05.991624767 +0000 UTC m=+1.126544030" Jan 14 01:17:06.005240 kubelet[2863]: I0114 01:17:06.004976 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" podStartSLOduration=3.004962363 podStartE2EDuration="3.004962363s" podCreationTimestamp="2026-01-14 01:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:06.002730222 +0000 UTC m=+1.137649475" watchObservedRunningTime="2026-01-14 01:17:06.004962363 +0000 UTC m=+1.139881626" Jan 14 01:17:06.013618 kubelet[2863]: I0114 01:17:06.013343 2863 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" podStartSLOduration=3.013330486 podStartE2EDuration="3.013330486s" podCreationTimestamp="2026-01-14 01:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:06.013186556 +0000 UTC m=+1.148105809" watchObservedRunningTime="2026-01-14 01:17:06.013330486 +0000 UTC m=+1.148249739" Jan 14 01:17:06.016695 kubelet[2863]: I0114 01:17:06.016511 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:06.018013 kubelet[2863]: I0114 01:17:06.017053 2863 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:06.029190 kubelet[2863]: E0114 01:17:06.029163 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:06.029502 kubelet[2863]: E0114 01:17:06.029332 2863 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-2c3a114250\" already exists" pod="kube-system/kube-apiserver-ci-4578-0-0-p-2c3a114250" Jan 14 01:17:09.742867 kubelet[2863]: I0114 01:17:09.742673 2863 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 14 01:17:09.743852 containerd[1677]: time="2026-01-14T01:17:09.743155660Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 14 01:17:09.744409 kubelet[2863]: I0114 01:17:09.743699 2863 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 14 01:17:10.549313 systemd[1]: Created slice kubepods-besteffort-pod0bb84a83_4d01_4c1a_8cae_3a6238905074.slice - libcontainer container kubepods-besteffort-pod0bb84a83_4d01_4c1a_8cae_3a6238905074.slice. Jan 14 01:17:10.615407 kubelet[2863]: I0114 01:17:10.615346 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0bb84a83-4d01-4c1a-8cae-3a6238905074-xtables-lock\") pod \"kube-proxy-qkkjf\" (UID: \"0bb84a83-4d01-4c1a-8cae-3a6238905074\") " pod="kube-system/kube-proxy-qkkjf" Jan 14 01:17:10.615407 kubelet[2863]: I0114 01:17:10.615397 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0bb84a83-4d01-4c1a-8cae-3a6238905074-lib-modules\") pod \"kube-proxy-qkkjf\" (UID: \"0bb84a83-4d01-4c1a-8cae-3a6238905074\") " pod="kube-system/kube-proxy-qkkjf" Jan 14 01:17:10.615681 kubelet[2863]: I0114 01:17:10.615438 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0bb84a83-4d01-4c1a-8cae-3a6238905074-kube-proxy\") pod \"kube-proxy-qkkjf\" (UID: \"0bb84a83-4d01-4c1a-8cae-3a6238905074\") " pod="kube-system/kube-proxy-qkkjf" Jan 14 01:17:10.615681 kubelet[2863]: I0114 01:17:10.615464 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28fx\" (UniqueName: \"kubernetes.io/projected/0bb84a83-4d01-4c1a-8cae-3a6238905074-kube-api-access-v28fx\") pod \"kube-proxy-qkkjf\" (UID: \"0bb84a83-4d01-4c1a-8cae-3a6238905074\") " pod="kube-system/kube-proxy-qkkjf" Jan 14 01:17:10.659714 systemd[1]: Created slice 
kubepods-besteffort-pod1a843e5a_8b6c_4213_b568_9d2f828dc754.slice - libcontainer container kubepods-besteffort-pod1a843e5a_8b6c_4213_b568_9d2f828dc754.slice. Jan 14 01:17:10.718083 kubelet[2863]: I0114 01:17:10.716276 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w98n\" (UniqueName: \"kubernetes.io/projected/1a843e5a-8b6c-4213-b568-9d2f828dc754-kube-api-access-7w98n\") pod \"tigera-operator-7dcd859c48-rblzk\" (UID: \"1a843e5a-8b6c-4213-b568-9d2f828dc754\") " pod="tigera-operator/tigera-operator-7dcd859c48-rblzk" Jan 14 01:17:10.718083 kubelet[2863]: I0114 01:17:10.716449 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1a843e5a-8b6c-4213-b568-9d2f828dc754-var-lib-calico\") pod \"tigera-operator-7dcd859c48-rblzk\" (UID: \"1a843e5a-8b6c-4213-b568-9d2f828dc754\") " pod="tigera-operator/tigera-operator-7dcd859c48-rblzk" Jan 14 01:17:10.866288 containerd[1677]: time="2026-01-14T01:17:10.866117178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qkkjf,Uid:0bb84a83-4d01-4c1a-8cae-3a6238905074,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:10.892762 containerd[1677]: time="2026-01-14T01:17:10.892649639Z" level=info msg="connecting to shim b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c" address="unix:///run/containerd/s/b2ff555ab5d9a9eb7c5cda145ffce1411a20ef58f331925f2e43526b1ed4e699" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:10.933392 systemd[1]: Started cri-containerd-b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c.scope - libcontainer container b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c. 
Jan 14 01:17:10.951060 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 14 01:17:10.951190 kernel: audit: type=1334 audit(1768353430.947:444): prog-id=133 op=LOAD Jan 14 01:17:10.947000 audit: BPF prog-id=133 op=LOAD Jan 14 01:17:10.955154 kernel: audit: type=1334 audit(1768353430.947:445): prog-id=134 op=LOAD Jan 14 01:17:10.947000 audit: BPF prog-id=134 op=LOAD Jan 14 01:17:10.947000 audit[2932]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.960686 kernel: audit: type=1300 audit(1768353430.947:445): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.968766 containerd[1677]: time="2026-01-14T01:17:10.968725480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-rblzk,Uid:1a843e5a-8b6c-4213-b568-9d2f828dc754,Namespace:tigera-operator,Attempt:0,}" Jan 14 01:17:10.947000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.975140 kernel: audit: type=1327 audit(1768353430.947:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: BPF prog-id=134 
op=UNLOAD Jan 14 01:17:10.997085 kernel: audit: type=1334 audit(1768353430.948:446): prog-id=134 op=UNLOAD Jan 14 01:17:10.948000 audit[2932]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.014225 kernel: audit: type=1300 audit(1768353430.948:446): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.014336 kernel: audit: type=1327 audit(1768353430.948:446): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: BPF prog-id=135 op=LOAD Jan 14 01:17:11.025393 kernel: audit: type=1334 audit(1768353430.948:447): prog-id=135 op=LOAD Jan 14 01:17:11.025548 kernel: audit: type=1300 audit(1768353430.948:447): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.948000 audit[2932]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.033306 kernel: audit: type=1327 audit(1768353430.948:447): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:11.033488 containerd[1677]: time="2026-01-14T01:17:11.027411825Z" level=info msg="connecting to shim 4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d" address="unix:///run/containerd/s/ae10d5e669d8a94f38783b7b4a0ca9944c153ccc850bf49e719b33f66c2b4d47" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:10.948000 audit: BPF prog-id=136 op=LOAD Jan 14 01:17:10.948000 audit[2932]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: BPF prog-id=136 op=UNLOAD Jan 14 01:17:10.948000 audit[2932]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: BPF prog-id=135 op=UNLOAD Jan 14 01:17:10.948000 audit[2932]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:10.948000 audit: BPF prog-id=137 op=LOAD Jan 14 01:17:10.948000 audit[2932]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2921 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:10.948000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236383965323563306636333562333935613931386139623064386465 Jan 14 01:17:11.039430 containerd[1677]: 
time="2026-01-14T01:17:11.039377470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qkkjf,Uid:0bb84a83-4d01-4c1a-8cae-3a6238905074,Namespace:kube-system,Attempt:0,} returns sandbox id \"b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c\"" Jan 14 01:17:11.045210 containerd[1677]: time="2026-01-14T01:17:11.045112632Z" level=info msg="CreateContainer within sandbox \"b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 14 01:17:11.057730 containerd[1677]: time="2026-01-14T01:17:11.057655737Z" level=info msg="Container c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:11.062117 systemd[1]: Started cri-containerd-4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d.scope - libcontainer container 4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d. Jan 14 01:17:11.070197 containerd[1677]: time="2026-01-14T01:17:11.070153613Z" level=info msg="CreateContainer within sandbox \"b689e25c0f635b395a918a9b0d8ded15514b8138b64ef081c08e8d66b024e38c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873\"" Jan 14 01:17:11.071758 containerd[1677]: time="2026-01-14T01:17:11.071688843Z" level=info msg="StartContainer for \"c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873\"" Jan 14 01:17:11.073844 containerd[1677]: time="2026-01-14T01:17:11.073818534Z" level=info msg="connecting to shim c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873" address="unix:///run/containerd/s/b2ff555ab5d9a9eb7c5cda145ffce1411a20ef58f331925f2e43526b1ed4e699" protocol=ttrpc version=3 Jan 14 01:17:11.080000 audit: BPF prog-id=138 op=LOAD Jan 14 01:17:11.080000 audit: BPF prog-id=139 op=LOAD Jan 14 01:17:11.080000 audit[2979]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 
a1=c000138238 a2=98 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.080000 audit: BPF prog-id=139 op=UNLOAD Jan 14 01:17:11.080000 audit[2979]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.080000 audit: BPF prog-id=140 op=LOAD Jan 14 01:17:11.080000 audit[2979]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.080000 audit: BPF prog-id=141 op=LOAD Jan 14 01:17:11.080000 audit[2979]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.081000 audit: BPF prog-id=141 op=UNLOAD Jan 14 01:17:11.081000 audit[2979]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.081000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.081000 audit: BPF prog-id=140 op=UNLOAD Jan 14 01:17:11.081000 audit[2979]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.081000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.081000 audit: BPF prog-id=142 op=LOAD Jan 14 
01:17:11.081000 audit[2979]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2966 pid=2979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.081000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430353861306362346432656137636365623532383866653931653434 Jan 14 01:17:11.101271 systemd[1]: Started cri-containerd-c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873.scope - libcontainer container c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873. Jan 14 01:17:11.136899 containerd[1677]: time="2026-01-14T01:17:11.135867370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-rblzk,Uid:1a843e5a-8b6c-4213-b568-9d2f828dc754,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d\"" Jan 14 01:17:11.137939 containerd[1677]: time="2026-01-14T01:17:11.137908671Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 14 01:17:11.157000 audit: BPF prog-id=143 op=LOAD Jan 14 01:17:11.157000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2921 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.157000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339343038326530636330386264373962326233636630343032666566 Jan 14 01:17:11.157000 audit: BPF prog-id=144 op=LOAD Jan 14 01:17:11.157000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2921 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339343038326530636330386264373962326233636630343032666566 Jan 14 01:17:11.158000 audit: BPF prog-id=144 op=UNLOAD Jan 14 01:17:11.158000 audit[2999]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.158000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339343038326530636330386264373962326233636630343032666566 Jan 14 01:17:11.158000 audit: BPF prog-id=143 op=UNLOAD Jan 14 01:17:11.158000 audit[2999]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2921 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:11.158000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339343038326530636330386264373962326233636630343032666566 Jan 14 01:17:11.158000 audit: BPF prog-id=145 op=LOAD Jan 14 01:17:11.158000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2921 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.158000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339343038326530636330386264373962326233636630343032666566 Jan 14 01:17:11.181258 containerd[1677]: time="2026-01-14T01:17:11.181214149Z" level=info msg="StartContainer for \"c94082e0cc08bd79b2b3cf0402fef2087fd6a6f7ad2b939a36c1b08c45a61873\" returns successfully" Jan 14 01:17:11.319000 audit[3073]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.319000 audit[3074]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.319000 audit[3073]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffff8f3dfe0 a2=0 a3=7ffff8f3dfcc items=0 ppid=3013 pid=3073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.319000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:17:11.319000 audit[3074]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdc8cca500 a2=0 a3=7ffdc8cca4ec items=0 ppid=3013 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.319000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:17:11.323000 audit[3075]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.323000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc526d04f0 a2=0 a3=7ffc526d04dc items=0 ppid=3013 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.323000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:17:11.325000 audit[3076]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.325000 audit[3076]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee27d7700 a2=0 a3=7ffee27d76ec items=0 ppid=3013 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.325000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 
01:17:11.328000 audit[3079]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.328000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc2123b890 a2=0 a3=7ffc2123b87c items=0 ppid=3013 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.328000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:17:11.329000 audit[3078]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.329000 audit[3078]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffec41ec290 a2=0 a3=7ffec41ec27c items=0 ppid=3013 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.329000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:17:11.434000 audit[3082]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3082 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.434000 audit[3082]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffecb36d730 a2=0 a3=7ffecb36d71c items=0 ppid=3013 pid=3082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.434000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:17:11.442000 audit[3084]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.442000 audit[3084]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffdc679c7c0 a2=0 a3=7ffdc679c7ac items=0 ppid=3013 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.442000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 14 01:17:11.451000 audit[3087]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.451000 audit[3087]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fffba36e5f0 a2=0 a3=7fffba36e5dc items=0 ppid=3013 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.451000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 14 01:17:11.454000 audit[3088]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3088 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.454000 audit[3088]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff56ac99e0 a2=0 a3=7fff56ac99cc items=0 ppid=3013 pid=3088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.454000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:17:11.462000 audit[3090]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3090 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.462000 audit[3090]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe1e9c97c0 a2=0 a3=7ffe1e9c97ac items=0 ppid=3013 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.462000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:17:11.465000 audit[3091]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.465000 audit[3091]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc35763da0 a2=0 a3=7ffc35763d8c items=0 ppid=3013 pid=3091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.465000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:17:11.473000 audit[3093]: NETFILTER_CFG 
table=filter:66 family=2 entries=1 op=nft_register_rule pid=3093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.473000 audit[3093]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff5023c760 a2=0 a3=7fff5023c74c items=0 ppid=3013 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.473000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:17:11.482000 audit[3096]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.482000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe782ca3e0 a2=0 a3=7ffe782ca3cc items=0 ppid=3013 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.482000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 14 01:17:11.486000 audit[3097]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.486000 audit[3097]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcee6341e0 a2=0 a3=7ffcee6341cc items=0 ppid=3013 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.486000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:17:11.494000 audit[3099]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.494000 audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc87f673a0 a2=0 a3=7ffc87f6738c items=0 ppid=3013 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.494000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:17:11.496000 audit[3100]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3100 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.496000 audit[3100]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe21c1dde0 a2=0 a3=7ffe21c1ddcc items=0 ppid=3013 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.496000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:17:11.503000 audit[3102]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.503000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 
a0=3 a1=7ffc2d1e9e50 a2=0 a3=7ffc2d1e9e3c items=0 ppid=3013 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.503000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:17:11.526000 audit[3105]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.526000 audit[3105]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffef31e66f0 a2=0 a3=7ffef31e66dc items=0 ppid=3013 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.526000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:17:11.538000 audit[3108]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3108 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.538000 audit[3108]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc3fd8fc10 a2=0 a3=7ffc3fd8fbfc items=0 ppid=3013 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.538000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:17:11.542000 audit[3109]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.542000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffbb587fb0 a2=0 a3=7fffbb587f9c items=0 ppid=3013 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.542000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:17:11.548000 audit[3111]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.548000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fffa7e6b110 a2=0 a3=7fffa7e6b0fc items=0 ppid=3013 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.548000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:17:11.552000 audit[3114]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.552000 audit[3114]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb21218c0 a2=0 a3=7ffeb21218ac 
items=0 ppid=3013 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.552000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:17:11.554000 audit[3115]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.554000 audit[3115]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc53676020 a2=0 a3=7ffc5367600c items=0 ppid=3013 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.554000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:17:11.560000 audit[3117]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:17:11.560000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffe75032f10 a2=0 a3=7ffe75032efc items=0 ppid=3013 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.560000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:17:11.601000 audit[3123]: 
NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:11.601000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffedaa19eb0 a2=0 a3=7ffedaa19e9c items=0 ppid=3013 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.601000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:11.611000 audit[3123]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:11.611000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffedaa19eb0 a2=0 a3=7ffedaa19e9c items=0 ppid=3013 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.611000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:11.615000 audit[3128]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.615000 audit[3128]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fff328f8a10 a2=0 a3=7fff328f89fc items=0 ppid=3013 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.615000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:17:11.621000 audit[3130]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.621000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc839c8430 a2=0 a3=7ffc839c841c items=0 ppid=3013 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.621000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 14 01:17:11.628000 audit[3133]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3133 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.628000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd61c458f0 a2=0 a3=7ffd61c458dc items=0 ppid=3013 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.628000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 14 01:17:11.629000 audit[3134]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.629000 audit[3134]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc09eb4bf0 a2=0 a3=7ffc09eb4bdc items=0 ppid=3013 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.629000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:17:11.632000 audit[3136]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.632000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd2c00f540 a2=0 a3=7ffd2c00f52c items=0 ppid=3013 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.632000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:17:11.634000 audit[3137]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3137 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.634000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd8ccb2a30 a2=0 a3=7ffd8ccb2a1c items=0 ppid=3013 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.634000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:17:11.637000 audit[3139]: 
NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.637000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe76e987a0 a2=0 a3=7ffe76e9878c items=0 ppid=3013 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.637000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 14 01:17:11.642000 audit[3142]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3142 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.642000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffd16a23290 a2=0 a3=7ffd16a2327c items=0 ppid=3013 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.642000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:17:11.644000 audit[3143]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.644000 audit[3143]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde5fda940 a2=0 a3=7ffde5fda92c items=0 ppid=3013 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.644000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:17:11.647000 audit[3145]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.647000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffe401e600 a2=0 a3=7fffe401e5ec items=0 ppid=3013 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:17:11.648000 audit[3146]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.648000 audit[3146]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdb6d5d510 a2=0 a3=7ffdb6d5d4fc items=0 ppid=3013 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.648000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:17:11.653000 audit[3148]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.653000 audit[3148]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe3c98a960 a2=0 a3=7ffe3c98a94c items=0 ppid=3013 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.653000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:17:11.658000 audit[3151]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.658000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd92434590 a2=0 a3=7ffd9243457c items=0 ppid=3013 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.658000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:17:11.662000 audit[3154]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.662000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdbeaad8d0 a2=0 a3=7ffdbeaad8bc items=0 ppid=3013 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.662000 audit: 
PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 14 01:17:11.663000 audit[3155]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.663000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd087f22a0 a2=0 a3=7ffd087f228c items=0 ppid=3013 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.663000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:17:11.666000 audit[3157]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.666000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff9ab590a0 a2=0 a3=7fff9ab5908c items=0 ppid=3013 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.666000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:17:11.670000 audit[3160]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.670000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd3a423c60 
a2=0 a3=7ffd3a423c4c items=0 ppid=3013 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.670000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:17:11.671000 audit[3161]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3161 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.671000 audit[3161]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc0b3f6c00 a2=0 a3=7ffc0b3f6bec items=0 ppid=3013 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.671000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:17:11.673000 audit[3163]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.673000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffdb5ce16d0 a2=0 a3=7ffdb5ce16bc items=0 ppid=3013 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.673000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 
01:17:11.675000 audit[3164]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3164 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.675000 audit[3164]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff28686040 a2=0 a3=7fff2868602c items=0 ppid=3013 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.675000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:17:11.679000 audit[3166]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.679000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffe33c42d0 a2=0 a3=7fffe33c42bc items=0 ppid=3013 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.679000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:17:11.684000 audit[3169]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:17:11.684000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe0fa873e0 a2=0 a3=7ffe0fa873cc items=0 ppid=3013 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.684000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:17:11.689000 audit[3171]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:17:11.689000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffc6c3fd0e0 a2=0 a3=7ffc6c3fd0cc items=0 ppid=3013 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.689000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:11.689000 audit[3171]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:17:11.689000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffc6c3fd0e0 a2=0 a3=7ffc6c3fd0cc items=0 ppid=3013 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:11.689000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:12.048475 kubelet[2863]: I0114 01:17:12.048187 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qkkjf" podStartSLOduration=2.04816485 podStartE2EDuration="2.04816485s" podCreationTimestamp="2026-01-14 01:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:12.04804672 +0000 UTC m=+7.182966013" 
watchObservedRunningTime="2026-01-14 01:17:12.04816485 +0000 UTC m=+7.183084143" Jan 14 01:17:13.095061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3316821614.mount: Deactivated successfully. Jan 14 01:17:13.606706 containerd[1677]: time="2026-01-14T01:17:13.606655629Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:13.607802 containerd[1677]: time="2026-01-14T01:17:13.607650629Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 14 01:17:13.608501 containerd[1677]: time="2026-01-14T01:17:13.608474580Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:13.610173 containerd[1677]: time="2026-01-14T01:17:13.610146230Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:13.610718 containerd[1677]: time="2026-01-14T01:17:13.610684471Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.47274784s" Jan 14 01:17:13.610777 containerd[1677]: time="2026-01-14T01:17:13.610766751Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 14 01:17:13.614116 containerd[1677]: time="2026-01-14T01:17:13.614092202Z" level=info msg="CreateContainer within sandbox 
\"4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 14 01:17:13.623440 containerd[1677]: time="2026-01-14T01:17:13.623410716Z" level=info msg="Container b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:13.625755 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount370709814.mount: Deactivated successfully. Jan 14 01:17:13.630956 containerd[1677]: time="2026-01-14T01:17:13.630909479Z" level=info msg="CreateContainer within sandbox \"4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\"" Jan 14 01:17:13.631738 containerd[1677]: time="2026-01-14T01:17:13.631634979Z" level=info msg="StartContainer for \"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\"" Jan 14 01:17:13.632883 containerd[1677]: time="2026-01-14T01:17:13.632857290Z" level=info msg="connecting to shim b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5" address="unix:///run/containerd/s/ae10d5e669d8a94f38783b7b4a0ca9944c153ccc850bf49e719b33f66c2b4d47" protocol=ttrpc version=3 Jan 14 01:17:13.653156 systemd[1]: Started cri-containerd-b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5.scope - libcontainer container b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5. 
Jan 14 01:17:13.663000 audit: BPF prog-id=146 op=LOAD Jan 14 01:17:13.664000 audit: BPF prog-id=147 op=LOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=147 op=UNLOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=148 op=LOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=149 op=LOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=149 op=UNLOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=148 op=UNLOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.664000 audit: BPF prog-id=150 op=LOAD Jan 14 01:17:13.664000 audit[3180]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2966 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:13.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239336164326230366464633537643739366534363339623431346439 Jan 14 01:17:13.685202 containerd[1677]: time="2026-01-14T01:17:13.685029282Z" level=info msg="StartContainer for \"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\" returns successfully" Jan 14 01:17:14.050867 kubelet[2863]: I0114 01:17:14.050754 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-rblzk" podStartSLOduration=1.575970403 podStartE2EDuration="4.050740334s" podCreationTimestamp="2026-01-14 01:17:10 +0000 UTC" firstStartedPulling="2026-01-14 01:17:11.13683099 +0000 UTC m=+6.271750243" lastFinishedPulling="2026-01-14 01:17:13.611600921 +0000 UTC m=+8.746520174" observedRunningTime="2026-01-14 01:17:14.050418514 +0000 UTC m=+9.185337777" watchObservedRunningTime="2026-01-14 01:17:14.050740334 +0000 UTC m=+9.185659587" Jan 14 01:17:14.997036 update_engine[1658]: I20260114 01:17:14.996150 1658 update_attempter.cc:509] Updating boot flags... 
Jan 14 01:17:19.588545 sudo[1911]: pam_unix(sudo:session): session closed for user root Jan 14 01:17:19.587000 audit[1911]: USER_END pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:17:19.595186 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 14 01:17:19.595259 kernel: audit: type=1106 audit(1768353439.587:524): pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:17:19.599378 kernel: audit: type=1104 audit(1768353439.587:525): pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:17:19.587000 audit[1911]: CRED_DISP pid=1911 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 01:17:19.710034 sshd[1910]: Connection closed by 68.220.241.50 port 48232 Jan 14 01:17:19.711158 sshd-session[1906]: pam_unix(sshd:session): session closed for user core Jan 14 01:17:19.722183 kernel: audit: type=1106 audit(1768353439.713:526): pid=1906 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:17:19.713000 audit[1906]: USER_END pid=1906 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:17:19.727432 systemd[1]: sshd@6-77.42.79.167:22-68.220.241.50:48232.service: Deactivated successfully. Jan 14 01:17:19.731730 systemd[1]: session-8.scope: Deactivated successfully. Jan 14 01:17:19.713000 audit[1906]: CRED_DISP pid=1906 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:17:19.737213 kernel: audit: type=1104 audit(1768353439.713:527): pid=1906 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:17:19.737589 systemd[1]: session-8.scope: Consumed 5.373s CPU time, 234.1M memory peak. Jan 14 01:17:19.740700 systemd-logind[1657]: Session 8 logged out. Waiting for processes to exit. 
Jan 14 01:17:19.726000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.79.167:22-68.220.241.50:48232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:17:19.744396 systemd-logind[1657]: Removed session 8. Jan 14 01:17:19.748051 kernel: audit: type=1131 audit(1768353439.726:528): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.79.167:22-68.220.241.50:48232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:17:20.341000 audit[3283]: NETFILTER_CFG table=filter:105 family=2 entries=14 op=nft_register_rule pid=3283 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.347074 kernel: audit: type=1325 audit(1768353440.341:529): table=filter:105 family=2 entries=14 op=nft_register_rule pid=3283 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.341000 audit[3283]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffe1dafb20 a2=0 a3=7fffe1dafb0c items=0 ppid=3013 pid=3283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.356045 kernel: audit: type=1300 audit(1768353440.341:529): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fffe1dafb20 a2=0 a3=7fffe1dafb0c items=0 ppid=3013 pid=3283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.341000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:20.362028 kernel: audit: type=1327 audit(1768353440.341:529): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:20.356000 audit[3283]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3283 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.367049 kernel: audit: type=1325 audit(1768353440.356:530): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3283 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.356000 audit[3283]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffe1dafb20 a2=0 a3=0 items=0 ppid=3013 pid=3283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.375083 kernel: audit: type=1300 audit(1768353440.356:530): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffe1dafb20 a2=0 a3=0 items=0 ppid=3013 pid=3283 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.356000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:20.387000 audit[3285]: NETFILTER_CFG table=filter:107 family=2 entries=15 op=nft_register_rule pid=3285 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.387000 audit[3285]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffd03dc13b0 a2=0 a3=7ffd03dc139c items=0 ppid=3013 pid=3285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.387000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:20.391000 audit[3285]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3285 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:20.391000 audit[3285]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd03dc13b0 a2=0 a3=0 items=0 ppid=3013 pid=3285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:20.391000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:21.930000 audit[3287]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:21.930000 audit[3287]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffdf2d2bd20 a2=0 a3=7ffdf2d2bd0c items=0 ppid=3013 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:21.930000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:21.934000 audit[3287]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3287 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:21.934000 audit[3287]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdf2d2bd20 a2=0 a3=0 items=0 ppid=3013 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:21.934000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:22.950000 audit[3289]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:22.950000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fffeb5ded50 a2=0 a3=7fffeb5ded3c items=0 ppid=3013 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:22.950000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:22.955000 audit[3289]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:22.955000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffeb5ded50 a2=0 a3=0 items=0 ppid=3013 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:22.955000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:23.717000 audit[3291]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3291 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:23.717000 audit[3291]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffeae82ee40 a2=0 a3=7ffeae82ee2c items=0 ppid=3013 pid=3291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:23.717000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:23.725000 audit[3291]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3291 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:23.725000 audit[3291]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeae82ee40 a2=0 a3=0 items=0 ppid=3013 pid=3291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:23.725000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:23.745518 systemd[1]: Created slice kubepods-besteffort-podf2c7231f_6c2c_49db_a4c0_9790b4703c2e.slice - libcontainer container kubepods-besteffort-podf2c7231f_6c2c_49db_a4c0_9790b4703c2e.slice. 
Jan 14 01:17:23.814028 kubelet[2863]: I0114 01:17:23.813907 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9wm4\" (UniqueName: \"kubernetes.io/projected/f2c7231f-6c2c-49db-a4c0-9790b4703c2e-kube-api-access-b9wm4\") pod \"calico-typha-5b58b496f6-r485v\" (UID: \"f2c7231f-6c2c-49db-a4c0-9790b4703c2e\") " pod="calico-system/calico-typha-5b58b496f6-r485v" Jan 14 01:17:23.814028 kubelet[2863]: I0114 01:17:23.813947 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2c7231f-6c2c-49db-a4c0-9790b4703c2e-typha-certs\") pod \"calico-typha-5b58b496f6-r485v\" (UID: \"f2c7231f-6c2c-49db-a4c0-9790b4703c2e\") " pod="calico-system/calico-typha-5b58b496f6-r485v" Jan 14 01:17:23.814028 kubelet[2863]: I0114 01:17:23.813962 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2c7231f-6c2c-49db-a4c0-9790b4703c2e-tigera-ca-bundle\") pod \"calico-typha-5b58b496f6-r485v\" (UID: \"f2c7231f-6c2c-49db-a4c0-9790b4703c2e\") " pod="calico-system/calico-typha-5b58b496f6-r485v" Jan 14 01:17:24.039391 systemd[1]: Created slice kubepods-besteffort-pod50230d43_f56f_4ab7_984a_0ce0dc7b687f.slice - libcontainer container kubepods-besteffort-pod50230d43_f56f_4ab7_984a_0ce0dc7b687f.slice. 
Jan 14 01:17:24.052043 containerd[1677]: time="2026-01-14T01:17:24.051968548Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b58b496f6-r485v,Uid:f2c7231f-6c2c-49db-a4c0-9790b4703c2e,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:24.088073 containerd[1677]: time="2026-01-14T01:17:24.087516785Z" level=info msg="connecting to shim 6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499" address="unix:///run/containerd/s/f04661a35290d70b341b1e18554acac5b542ccfc1fad6c5767b181ae0e968a46" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:24.116143 kubelet[2863]: I0114 01:17:24.116076 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50230d43-f56f-4ab7-984a-0ce0dc7b687f-tigera-ca-bundle\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116143 kubelet[2863]: I0114 01:17:24.116122 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-cni-log-dir\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116331 kubelet[2863]: I0114 01:17:24.116152 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-flexvol-driver-host\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116331 kubelet[2863]: I0114 01:17:24.116173 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-cni-bin-dir\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116331 kubelet[2863]: I0114 01:17:24.116192 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-policysync\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116331 kubelet[2863]: I0114 01:17:24.116208 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-var-lib-calico\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116331 kubelet[2863]: I0114 01:17:24.116245 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-xtables-lock\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116499 kubelet[2863]: I0114 01:17:24.116262 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rcj2\" (UniqueName: \"kubernetes.io/projected/50230d43-f56f-4ab7-984a-0ce0dc7b687f-kube-api-access-9rcj2\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116499 kubelet[2863]: I0114 01:17:24.116282 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-cni-net-dir\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116499 kubelet[2863]: I0114 01:17:24.116301 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/50230d43-f56f-4ab7-984a-0ce0dc7b687f-node-certs\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116499 kubelet[2863]: I0114 01:17:24.116317 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-var-run-calico\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.116499 kubelet[2863]: I0114 01:17:24.116335 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50230d43-f56f-4ab7-984a-0ce0dc7b687f-lib-modules\") pod \"calico-node-qwbj7\" (UID: \"50230d43-f56f-4ab7-984a-0ce0dc7b687f\") " pod="calico-system/calico-node-qwbj7" Jan 14 01:17:24.127323 systemd[1]: Started cri-containerd-6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499.scope - libcontainer container 6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499. 
Jan 14 01:17:24.142000 audit: BPF prog-id=151 op=LOAD Jan 14 01:17:24.143000 audit: BPF prog-id=152 op=LOAD Jan 14 01:17:24.143000 audit[3314]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.143000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.143000 audit: BPF prog-id=152 op=UNLOAD Jan 14 01:17:24.143000 audit[3314]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.143000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.144000 audit: BPF prog-id=153 op=LOAD Jan 14 01:17:24.144000 audit[3314]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.144000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.144000 audit: BPF prog-id=154 op=LOAD Jan 14 01:17:24.144000 audit[3314]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.144000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.144000 audit: BPF prog-id=154 op=UNLOAD Jan 14 01:17:24.144000 audit[3314]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.144000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.144000 audit: BPF prog-id=153 op=UNLOAD Jan 14 01:17:24.144000 audit[3314]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:24.144000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.144000 audit: BPF prog-id=155 op=LOAD Jan 14 01:17:24.144000 audit[3314]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3303 pid=3314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.144000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3632393263323264626234336163363563353234623635646332356133 Jan 14 01:17:24.204739 kubelet[2863]: E0114 01:17:24.204261 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:24.217690 containerd[1677]: time="2026-01-14T01:17:24.217545135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b58b496f6-r485v,Uid:f2c7231f-6c2c-49db-a4c0-9790b4703c2e,Namespace:calico-system,Attempt:0,} returns sandbox id \"6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499\"" Jan 14 01:17:24.220166 kubelet[2863]: E0114 01:17:24.220048 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.220166 kubelet[2863]: W0114 01:17:24.220089 2863 driver-call.go:149] FlexVolume: driver call failed: 
executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.220166 kubelet[2863]: E0114 01:17:24.220110 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.220557 kubelet[2863]: E0114 01:17:24.220446 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.220557 kubelet[2863]: W0114 01:17:24.220457 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.220557 kubelet[2863]: E0114 01:17:24.220467 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.221734 kubelet[2863]: E0114 01:17:24.221718 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.221734 kubelet[2863]: W0114 01:17:24.221730 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.221816 kubelet[2863]: E0114 01:17:24.221741 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.221929 kubelet[2863]: E0114 01:17:24.221904 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.221929 kubelet[2863]: W0114 01:17:24.221914 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.221929 kubelet[2863]: E0114 01:17:24.221921 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.222184 containerd[1677]: time="2026-01-14T01:17:24.222159083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 14 01:17:24.222479 kubelet[2863]: E0114 01:17:24.222278 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.222479 kubelet[2863]: W0114 01:17:24.222289 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.222479 kubelet[2863]: E0114 01:17:24.222297 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.223292 kubelet[2863]: E0114 01:17:24.223040 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.223292 kubelet[2863]: W0114 01:17:24.223050 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.223292 kubelet[2863]: E0114 01:17:24.223058 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.223292 kubelet[2863]: E0114 01:17:24.223284 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.223292 kubelet[2863]: W0114 01:17:24.223291 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.223292 kubelet[2863]: E0114 01:17:24.223297 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.224022 kubelet[2863]: E0114 01:17:24.223860 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.224022 kubelet[2863]: W0114 01:17:24.223874 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.224163 kubelet[2863]: E0114 01:17:24.223884 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.224779 kubelet[2863]: E0114 01:17:24.224535 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.224779 kubelet[2863]: W0114 01:17:24.224749 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.224779 kubelet[2863]: E0114 01:17:24.224758 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.226350 kubelet[2863]: E0114 01:17:24.226300 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.227234 kubelet[2863]: W0114 01:17:24.226909 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.227234 kubelet[2863]: E0114 01:17:24.227070 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.227575 kubelet[2863]: E0114 01:17:24.227469 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.227940 kubelet[2863]: W0114 01:17:24.227854 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.227940 kubelet[2863]: E0114 01:17:24.227918 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.229716 kubelet[2863]: E0114 01:17:24.229661 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.229716 kubelet[2863]: W0114 01:17:24.229671 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.229716 kubelet[2863]: E0114 01:17:24.229681 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.230124 kubelet[2863]: E0114 01:17:24.230077 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.230124 kubelet[2863]: W0114 01:17:24.230087 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.230124 kubelet[2863]: E0114 01:17:24.230094 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.231296 kubelet[2863]: E0114 01:17:24.231249 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.231296 kubelet[2863]: W0114 01:17:24.231260 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.231296 kubelet[2863]: E0114 01:17:24.231269 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.236035 kubelet[2863]: E0114 01:17:24.234419 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.236035 kubelet[2863]: W0114 01:17:24.234456 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.236035 kubelet[2863]: E0114 01:17:24.234470 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.238416 kubelet[2863]: E0114 01:17:24.237671 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.238416 kubelet[2863]: W0114 01:17:24.237690 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.238416 kubelet[2863]: E0114 01:17:24.237707 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.240736 kubelet[2863]: E0114 01:17:24.240675 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.240736 kubelet[2863]: W0114 01:17:24.240693 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.240736 kubelet[2863]: E0114 01:17:24.240709 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.242220 kubelet[2863]: E0114 01:17:24.242207 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.242287 kubelet[2863]: W0114 01:17:24.242278 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.242329 kubelet[2863]: E0114 01:17:24.242320 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.299126 kubelet[2863]: E0114 01:17:24.298408 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.299499 kubelet[2863]: W0114 01:17:24.299249 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.299499 kubelet[2863]: E0114 01:17:24.299297 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.299747 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.300511 kubelet[2863]: W0114 01:17:24.299759 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.299768 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.299944 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.300511 kubelet[2863]: W0114 01:17:24.299950 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.299957 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.300260 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.300511 kubelet[2863]: W0114 01:17:24.300270 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.300511 kubelet[2863]: E0114 01:17:24.300281 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.300737 kubelet[2863]: E0114 01:17:24.300572 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.300737 kubelet[2863]: W0114 01:17:24.300582 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.300737 kubelet[2863]: E0114 01:17:24.300592 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.300980 kubelet[2863]: E0114 01:17:24.300870 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.300980 kubelet[2863]: W0114 01:17:24.300909 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.300980 kubelet[2863]: E0114 01:17:24.300919 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.302058 kubelet[2863]: E0114 01:17:24.301996 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.302058 kubelet[2863]: W0114 01:17:24.302036 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.302058 kubelet[2863]: E0114 01:17:24.302047 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.302497 kubelet[2863]: E0114 01:17:24.302278 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.302497 kubelet[2863]: W0114 01:17:24.302290 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.302497 kubelet[2863]: E0114 01:17:24.302300 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.302582 kubelet[2863]: E0114 01:17:24.302550 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.302582 kubelet[2863]: W0114 01:17:24.302559 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.302582 kubelet[2863]: E0114 01:17:24.302568 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.302837 kubelet[2863]: E0114 01:17:24.302822 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.302837 kubelet[2863]: W0114 01:17:24.302832 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.303036 kubelet[2863]: E0114 01:17:24.302847 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.303105 kubelet[2863]: E0114 01:17:24.303069 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.303105 kubelet[2863]: W0114 01:17:24.303079 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.303105 kubelet[2863]: E0114 01:17:24.303086 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.303350 kubelet[2863]: E0114 01:17:24.303322 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.303350 kubelet[2863]: W0114 01:17:24.303333 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.303350 kubelet[2863]: E0114 01:17:24.303339 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.303737 kubelet[2863]: E0114 01:17:24.303710 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.303737 kubelet[2863]: W0114 01:17:24.303721 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.303737 kubelet[2863]: E0114 01:17:24.303729 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.303936 kubelet[2863]: E0114 01:17:24.303911 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.303936 kubelet[2863]: W0114 01:17:24.303921 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.303936 kubelet[2863]: E0114 01:17:24.303928 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.304239 kubelet[2863]: E0114 01:17:24.304216 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.304239 kubelet[2863]: W0114 01:17:24.304228 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.304296 kubelet[2863]: E0114 01:17:24.304250 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.304532 kubelet[2863]: E0114 01:17:24.304510 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.304532 kubelet[2863]: W0114 01:17:24.304520 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.304532 kubelet[2863]: E0114 01:17:24.304527 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.304799 kubelet[2863]: E0114 01:17:24.304775 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.304799 kubelet[2863]: W0114 01:17:24.304785 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.304799 kubelet[2863]: E0114 01:17:24.304792 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.304987 kubelet[2863]: E0114 01:17:24.304966 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.304987 kubelet[2863]: W0114 01:17:24.304976 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.304987 kubelet[2863]: E0114 01:17:24.304982 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.305274 kubelet[2863]: E0114 01:17:24.305249 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.305274 kubelet[2863]: W0114 01:17:24.305262 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.305274 kubelet[2863]: E0114 01:17:24.305271 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.305686 kubelet[2863]: E0114 01:17:24.305536 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.305686 kubelet[2863]: W0114 01:17:24.305559 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.305686 kubelet[2863]: E0114 01:17:24.305588 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.320077 kubelet[2863]: E0114 01:17:24.320037 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.320077 kubelet[2863]: W0114 01:17:24.320058 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.320077 kubelet[2863]: E0114 01:17:24.320075 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.320344 kubelet[2863]: I0114 01:17:24.320115 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7g5\" (UniqueName: \"kubernetes.io/projected/f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b-kube-api-access-gn7g5\") pod \"csi-node-driver-pj96q\" (UID: \"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b\") " pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:24.320344 kubelet[2863]: E0114 01:17:24.320320 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.320344 kubelet[2863]: W0114 01:17:24.320327 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.320344 kubelet[2863]: E0114 01:17:24.320334 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.320529 kubelet[2863]: I0114 01:17:24.320356 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b-socket-dir\") pod \"csi-node-driver-pj96q\" (UID: \"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b\") " pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:24.320594 kubelet[2863]: E0114 01:17:24.320539 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.320594 kubelet[2863]: W0114 01:17:24.320546 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.320594 kubelet[2863]: E0114 01:17:24.320552 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.320594 kubelet[2863]: I0114 01:17:24.320568 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b-kubelet-dir\") pod \"csi-node-driver-pj96q\" (UID: \"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b\") " pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:24.320866 kubelet[2863]: E0114 01:17:24.320783 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.320866 kubelet[2863]: W0114 01:17:24.320790 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.320866 kubelet[2863]: E0114 01:17:24.320798 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.320866 kubelet[2863]: I0114 01:17:24.320811 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b-varrun\") pod \"csi-node-driver-pj96q\" (UID: \"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b\") " pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:24.321061 kubelet[2863]: E0114 01:17:24.321023 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.321061 kubelet[2863]: W0114 01:17:24.321031 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.321061 kubelet[2863]: E0114 01:17:24.321037 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.321061 kubelet[2863]: I0114 01:17:24.321056 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b-registration-dir\") pod \"csi-node-driver-pj96q\" (UID: \"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b\") " pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:24.321317 kubelet[2863]: E0114 01:17:24.321304 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.321317 kubelet[2863]: W0114 01:17:24.321315 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.321390 kubelet[2863]: E0114 01:17:24.321322 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.321608 kubelet[2863]: E0114 01:17:24.321563 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.321608 kubelet[2863]: W0114 01:17:24.321587 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.321608 kubelet[2863]: E0114 01:17:24.321620 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.321935 kubelet[2863]: E0114 01:17:24.321918 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.321935 kubelet[2863]: W0114 01:17:24.321928 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.321935 kubelet[2863]: E0114 01:17:24.321936 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.322170 kubelet[2863]: E0114 01:17:24.322147 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.322170 kubelet[2863]: W0114 01:17:24.322163 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.322170 kubelet[2863]: E0114 01:17:24.322170 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.322419 kubelet[2863]: E0114 01:17:24.322401 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.322419 kubelet[2863]: W0114 01:17:24.322412 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.322419 kubelet[2863]: E0114 01:17:24.322420 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.322700 kubelet[2863]: E0114 01:17:24.322677 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.322700 kubelet[2863]: W0114 01:17:24.322689 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.322700 kubelet[2863]: E0114 01:17:24.322698 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.322907 kubelet[2863]: E0114 01:17:24.322885 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.322907 kubelet[2863]: W0114 01:17:24.322894 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.322907 kubelet[2863]: E0114 01:17:24.322903 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.323207 kubelet[2863]: E0114 01:17:24.323186 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.323207 kubelet[2863]: W0114 01:17:24.323198 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.323207 kubelet[2863]: E0114 01:17:24.323205 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.323443 kubelet[2863]: E0114 01:17:24.323417 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.323443 kubelet[2863]: W0114 01:17:24.323425 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.323443 kubelet[2863]: E0114 01:17:24.323433 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.323626 kubelet[2863]: E0114 01:17:24.323598 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.323626 kubelet[2863]: W0114 01:17:24.323615 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.323626 kubelet[2863]: E0114 01:17:24.323621 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.350820 containerd[1677]: time="2026-01-14T01:17:24.350597668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qwbj7,Uid:50230d43-f56f-4ab7-984a-0ce0dc7b687f,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:24.369200 containerd[1677]: time="2026-01-14T01:17:24.369145878Z" level=info msg="connecting to shim 0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779" address="unix:///run/containerd/s/8b747dec2f230515377ffac264a521930b3d1651279be82ea28e9cb31f42d931" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:24.393272 systemd[1]: Started cri-containerd-0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779.scope - libcontainer container 0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779. Jan 14 01:17:24.414000 audit: BPF prog-id=156 op=LOAD Jan 14 01:17:24.414000 audit: BPF prog-id=157 op=LOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.414000 audit: BPF prog-id=157 op=UNLOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.414000 audit: BPF prog-id=158 op=LOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.414000 audit: BPF prog-id=159 op=LOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.414000 audit: BPF prog-id=159 op=UNLOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.414000 audit: BPF prog-id=158 op=UNLOAD Jan 14 01:17:24.414000 audit[3422]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.415000 audit: BPF prog-id=160 op=LOAD Jan 14 01:17:24.415000 audit[3422]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3410 pid=3422 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.415000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033313864376566303665306238306237343632386238356561366436 Jan 14 01:17:24.422411 kubelet[2863]: E0114 01:17:24.422360 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.422411 kubelet[2863]: W0114 01:17:24.422383 2863 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.422668 kubelet[2863]: E0114 01:17:24.422425 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.423166 kubelet[2863]: E0114 01:17:24.423144 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.423166 kubelet[2863]: W0114 01:17:24.423159 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.423166 kubelet[2863]: E0114 01:17:24.423170 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.425119 kubelet[2863]: E0114 01:17:24.424954 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.425347 kubelet[2863]: W0114 01:17:24.425288 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.425347 kubelet[2863]: E0114 01:17:24.425305 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.426184 kubelet[2863]: E0114 01:17:24.426162 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.426184 kubelet[2863]: W0114 01:17:24.426176 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.426184 kubelet[2863]: E0114 01:17:24.426186 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.428040 kubelet[2863]: E0114 01:17:24.427700 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.428040 kubelet[2863]: W0114 01:17:24.427712 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.428040 kubelet[2863]: E0114 01:17:24.427725 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.428192 kubelet[2863]: E0114 01:17:24.428056 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.428192 kubelet[2863]: W0114 01:17:24.428065 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.428192 kubelet[2863]: E0114 01:17:24.428074 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.428345 kubelet[2863]: E0114 01:17:24.428323 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.428345 kubelet[2863]: W0114 01:17:24.428336 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.428420 kubelet[2863]: E0114 01:17:24.428346 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.428643 kubelet[2863]: E0114 01:17:24.428624 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.428682 kubelet[2863]: W0114 01:17:24.428655 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.428682 kubelet[2863]: E0114 01:17:24.428664 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.429043 kubelet[2863]: E0114 01:17:24.428998 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.429043 kubelet[2863]: W0114 01:17:24.429036 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.429131 kubelet[2863]: E0114 01:17:24.429045 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.429297 kubelet[2863]: E0114 01:17:24.429278 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.429297 kubelet[2863]: W0114 01:17:24.429290 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.429356 kubelet[2863]: E0114 01:17:24.429298 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.429536 kubelet[2863]: E0114 01:17:24.429517 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.429536 kubelet[2863]: W0114 01:17:24.429530 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.429629 kubelet[2863]: E0114 01:17:24.429539 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.429858 kubelet[2863]: E0114 01:17:24.429838 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.429858 kubelet[2863]: W0114 01:17:24.429851 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.429858 kubelet[2863]: E0114 01:17:24.429860 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.430196 kubelet[2863]: E0114 01:17:24.430170 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.430196 kubelet[2863]: W0114 01:17:24.430184 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.430281 kubelet[2863]: E0114 01:17:24.430212 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:24.430475 kubelet[2863]: E0114 01:17:24.430457 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.430475 kubelet[2863]: W0114 01:17:24.430470 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.430550 kubelet[2863]: E0114 01:17:24.430479 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.430850 kubelet[2863]: E0114 01:17:24.430745 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.430850 kubelet[2863]: W0114 01:17:24.430758 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.430850 kubelet[2863]: E0114 01:17:24.430767 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 14 01:17:24.431195 kubelet[2863]: E0114 01:17:24.431175 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 01:17:24.431195 kubelet[2863]: W0114 01:17:24.431190 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 01:17:24.431260 kubelet[2863]: E0114 01:17:24.431200 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 01:17:24.435416 kubelet[2863]: E0114 01:17:24.434765 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 01:17:24.435416 kubelet[2863]: W0114 01:17:24.434775 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 01:17:24.435416 kubelet[2863]: E0114 01:17:24.434784 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jan 14 01:17:24.446657 containerd[1677]: time="2026-01-14T01:17:24.446530627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qwbj7,Uid:50230d43-f56f-4ab7-984a-0ce0dc7b687f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\"" Jan 14 01:17:24.455349 kubelet[2863]: E0114 01:17:24.455295 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:24.455349 kubelet[2863]: W0114 01:17:24.455336 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:24.455529 kubelet[2863]: E0114 01:17:24.455361 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:24.750000 audit[3476]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:24.752869 kernel: kauditd_printk_skb: 69 callbacks suppressed Jan 14 01:17:24.752934 kernel: audit: type=1325 audit(1768353444.750:555): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:24.750000 audit[3476]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe38215c40 a2=0 a3=7ffe38215c2c items=0 ppid=3013 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.768489 kernel: audit: type=1300 audit(1768353444.750:555): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe38215c40 a2=0 
a3=7ffe38215c2c items=0 ppid=3013 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.750000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:24.763000 audit[3476]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:24.793765 kernel: audit: type=1327 audit(1768353444.750:555): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:24.793840 kernel: audit: type=1325 audit(1768353444.763:556): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:24.763000 audit[3476]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe38215c40 a2=0 a3=0 items=0 ppid=3013 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.803994 kernel: audit: type=1300 audit(1768353444.763:556): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe38215c40 a2=0 a3=0 items=0 ppid=3013 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:24.763000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:24.817089 kernel: audit: type=1327 audit(1768353444.763:556): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:25.950462 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1051812501.mount: Deactivated successfully. Jan 14 01:17:25.990842 kubelet[2863]: E0114 01:17:25.990766 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:27.593786 containerd[1677]: time="2026-01-14T01:17:27.593717971Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:27.595284 containerd[1677]: time="2026-01-14T01:17:27.595143818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 14 01:17:27.596548 containerd[1677]: time="2026-01-14T01:17:27.596512026Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:27.598779 containerd[1677]: time="2026-01-14T01:17:27.598747271Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:27.599527 containerd[1677]: time="2026-01-14T01:17:27.599181640Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" 
in 3.376972077s" Jan 14 01:17:27.599600 containerd[1677]: time="2026-01-14T01:17:27.599586299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 14 01:17:27.601099 containerd[1677]: time="2026-01-14T01:17:27.601072986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 14 01:17:27.617473 containerd[1677]: time="2026-01-14T01:17:27.617436901Z" level=info msg="CreateContainer within sandbox \"6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 14 01:17:27.625245 containerd[1677]: time="2026-01-14T01:17:27.624394647Z" level=info msg="Container c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:27.630531 containerd[1677]: time="2026-01-14T01:17:27.630497065Z" level=info msg="CreateContainer within sandbox \"6292c22dbb43ac65c524b65dc25a399076df0c97508fa87156dd892f09685499\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781\"" Jan 14 01:17:27.630995 containerd[1677]: time="2026-01-14T01:17:27.630963513Z" level=info msg="StartContainer for \"c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781\"" Jan 14 01:17:27.631719 containerd[1677]: time="2026-01-14T01:17:27.631692432Z" level=info msg="connecting to shim c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781" address="unix:///run/containerd/s/f04661a35290d70b341b1e18554acac5b542ccfc1fad6c5767b181ae0e968a46" protocol=ttrpc version=3 Jan 14 01:17:27.652138 systemd[1]: Started cri-containerd-c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781.scope - libcontainer container c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781. 
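An editorial note on the audit records above: PROCTITLE values are hex-encoded command lines whose arguments are separated by NUL bytes. Decoding the value from the iptables-restore audit events recovers the actual invocation. A small sketch (the helper name `decodeProctitle` is my own):

```go
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle converts an audit PROCTITLE hex string back into the
// original command line: the decoded bytes are argv entries joined by NULs.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	// Proctitle value copied verbatim from the audit records above.
	const p = "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"
	cmd, _ := decodeProctitle(p)
	fmt.Println(cmd) // iptables-restore -w 5 -W 100000 --noflush --counters
}
```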
Jan 14 01:17:27.668000 audit: BPF prog-id=161 op=LOAD Jan 14 01:17:27.674445 kernel: audit: type=1334 audit(1768353447.668:557): prog-id=161 op=LOAD Jan 14 01:17:27.674552 kernel: audit: type=1334 audit(1768353447.669:558): prog-id=162 op=LOAD Jan 14 01:17:27.669000 audit: BPF prog-id=162 op=LOAD Jan 14 01:17:27.682395 kernel: audit: type=1300 audit(1768353447.669:558): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.690275 kernel: audit: type=1327 audit(1768353447.669:558): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=162 op=UNLOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=163 op=LOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=164 op=LOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=164 op=UNLOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=163 op=UNLOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.669000 audit: BPF prog-id=165 op=LOAD Jan 14 01:17:27.669000 audit[3488]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3303 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:27.669000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335303132663539306561373537333064626138656563666461386365 Jan 14 01:17:27.708165 containerd[1677]: time="2026-01-14T01:17:27.708131953Z" level=info msg="StartContainer for \"c5012f590ea75730dba8eecfda8cea0bb587d8700db45ef5e1351ab6997f1781\" returns successfully" Jan 14 01:17:27.989863 kubelet[2863]: E0114 01:17:27.989812 2863 
pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:28.101959 kubelet[2863]: I0114 01:17:28.101835 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5b58b496f6-r485v" podStartSLOduration=1.722937687 podStartE2EDuration="5.101808929s" podCreationTimestamp="2026-01-14 01:17:23 +0000 UTC" firstStartedPulling="2026-01-14 01:17:24.221460365 +0000 UTC m=+19.356379618" lastFinishedPulling="2026-01-14 01:17:27.600331607 +0000 UTC m=+22.735250860" observedRunningTime="2026-01-14 01:17:28.100060103 +0000 UTC m=+23.234979406" watchObservedRunningTime="2026-01-14 01:17:28.101808929 +0000 UTC m=+23.236728232" Jan 14 01:17:28.134195 kubelet[2863]: E0114 01:17:28.134103 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.134195 kubelet[2863]: W0114 01:17:28.134148 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.134195 kubelet[2863]: E0114 01:17:28.134184 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.134699 kubelet[2863]: E0114 01:17:28.134648 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.134699 kubelet[2863]: W0114 01:17:28.134670 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.134699 kubelet[2863]: E0114 01:17:28.134690 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.135219 kubelet[2863]: E0114 01:17:28.135141 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.135219 kubelet[2863]: W0114 01:17:28.135175 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.135219 kubelet[2863]: E0114 01:17:28.135204 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.135720 kubelet[2863]: E0114 01:17:28.135684 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.135720 kubelet[2863]: W0114 01:17:28.135708 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.135827 kubelet[2863]: E0114 01:17:28.135724 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.136583 kubelet[2863]: E0114 01:17:28.136360 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.136583 kubelet[2863]: W0114 01:17:28.136388 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.136583 kubelet[2863]: E0114 01:17:28.136411 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.136934 kubelet[2863]: E0114 01:17:28.136888 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.136934 kubelet[2863]: W0114 01:17:28.136911 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.136934 kubelet[2863]: E0114 01:17:28.136927 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.137558 kubelet[2863]: E0114 01:17:28.137514 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.137558 kubelet[2863]: W0114 01:17:28.137540 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.137558 kubelet[2863]: E0114 01:17:28.137558 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.137977 kubelet[2863]: E0114 01:17:28.137940 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.137977 kubelet[2863]: W0114 01:17:28.137960 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.137977 kubelet[2863]: E0114 01:17:28.137975 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.138427 kubelet[2863]: E0114 01:17:28.138387 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.138427 kubelet[2863]: W0114 01:17:28.138408 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.138427 kubelet[2863]: E0114 01:17:28.138424 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.138850 kubelet[2863]: E0114 01:17:28.138818 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.138850 kubelet[2863]: W0114 01:17:28.138844 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.138991 kubelet[2863]: E0114 01:17:28.138866 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.139475 kubelet[2863]: E0114 01:17:28.139442 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.139530 kubelet[2863]: W0114 01:17:28.139472 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.139640 kubelet[2863]: E0114 01:17:28.139534 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.140127 kubelet[2863]: E0114 01:17:28.140087 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.140127 kubelet[2863]: W0114 01:17:28.140111 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.140127 kubelet[2863]: E0114 01:17:28.140128 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.140766 kubelet[2863]: E0114 01:17:28.140705 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.140766 kubelet[2863]: W0114 01:17:28.140731 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.140766 kubelet[2863]: E0114 01:17:28.140749 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.141359 kubelet[2863]: E0114 01:17:28.141284 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.141359 kubelet[2863]: W0114 01:17:28.141310 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.141359 kubelet[2863]: E0114 01:17:28.141326 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.141861 kubelet[2863]: E0114 01:17:28.141780 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.141861 kubelet[2863]: W0114 01:17:28.141803 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.141861 kubelet[2863]: E0114 01:17:28.141822 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.160174 kubelet[2863]: E0114 01:17:28.160109 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.160174 kubelet[2863]: W0114 01:17:28.160148 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.160174 kubelet[2863]: E0114 01:17:28.160181 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.160775 kubelet[2863]: E0114 01:17:28.160718 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.160775 kubelet[2863]: W0114 01:17:28.160758 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.160882 kubelet[2863]: E0114 01:17:28.160776 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:28.161424 kubelet[2863]: E0114 01:17:28.161386 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.161424 kubelet[2863]: W0114 01:17:28.161413 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.161542 kubelet[2863]: E0114 01:17:28.161432 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:28.162003 kubelet[2863]: E0114 01:17:28.161947 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:28.162003 kubelet[2863]: W0114 01:17:28.161973 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:28.162003 kubelet[2863]: E0114 01:17:28.161993 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:29.088132 kubelet[2863]: I0114 01:17:29.088054 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:17:29.151556 kubelet[2863]: E0114 01:17:29.151218 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:29.151556 kubelet[2863]: W0114 01:17:29.151412 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:29.152661 kubelet[2863]: E0114 01:17:29.151594 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:29.153493 kubelet[2863]: E0114 01:17:29.153152 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:29.153493 kubelet[2863]: W0114 01:17:29.153335 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:29.153493 kubelet[2863]: E0114 01:17:29.153361 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:17:29.186924 kubelet[2863]: E0114 01:17:29.186916 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:17:29.187000 kubelet[2863]: W0114 01:17:29.186963 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:17:29.187000 kubelet[2863]: E0114 01:17:29.186972 2863 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:17:29.198334 containerd[1677]: time="2026-01-14T01:17:29.198276378Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:29.199415 containerd[1677]: time="2026-01-14T01:17:29.199335495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:29.200464 containerd[1677]: time="2026-01-14T01:17:29.200422604Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:29.202341 containerd[1677]: time="2026-01-14T01:17:29.202253740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:29.202721 containerd[1677]: time="2026-01-14T01:17:29.202666430Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.601568024s" Jan 14 01:17:29.202721 containerd[1677]: time="2026-01-14T01:17:29.202699970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 14 01:17:29.207119 containerd[1677]: time="2026-01-14T01:17:29.207001372Z" level=info msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 14 01:17:29.221024 containerd[1677]: time="2026-01-14T01:17:29.220359219Z" level=info msg="Container ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:29.229061 containerd[1677]: time="2026-01-14T01:17:29.228624653Z" level=info msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d\"" Jan 14 01:17:29.230739 containerd[1677]: time="2026-01-14T01:17:29.229526582Z" level=info msg="StartContainer for \"ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d\"" Jan 14 01:17:29.230083 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3920652293.mount: Deactivated successfully. 
Jan 14 01:17:29.235494 containerd[1677]: time="2026-01-14T01:17:29.235399091Z" level=info msg="connecting to shim ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d" address="unix:///run/containerd/s/8b747dec2f230515377ffac264a521930b3d1651279be82ea28e9cb31f42d931" protocol=ttrpc version=3 Jan 14 01:17:29.267290 systemd[1]: Started cri-containerd-ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d.scope - libcontainer container ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d. Jan 14 01:17:29.324000 audit: BPF prog-id=166 op=LOAD Jan 14 01:17:29.324000 audit[3597]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3410 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:29.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261363063323863336138323765326538383536646565313031663438 Jan 14 01:17:29.324000 audit: BPF prog-id=167 op=LOAD Jan 14 01:17:29.324000 audit[3597]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3410 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:29.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261363063323863336138323765326538383536646565313031663438 Jan 14 01:17:29.324000 audit: BPF prog-id=167 op=UNLOAD Jan 14 01:17:29.324000 audit[3597]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:29.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261363063323863336138323765326538383536646565313031663438 Jan 14 01:17:29.324000 audit: BPF prog-id=166 op=UNLOAD Jan 14 01:17:29.324000 audit[3597]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:29.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261363063323863336138323765326538383536646565313031663438 Jan 14 01:17:29.324000 audit: BPF prog-id=168 op=LOAD Jan 14 01:17:29.324000 audit[3597]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3410 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:29.324000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261363063323863336138323765326538383536646565313031663438 Jan 14 01:17:29.360519 containerd[1677]: 
time="2026-01-14T01:17:29.360120169Z" level=info msg="StartContainer for \"ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d\" returns successfully" Jan 14 01:17:29.380674 systemd[1]: cri-containerd-ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d.scope: Deactivated successfully. Jan 14 01:17:29.382510 containerd[1677]: time="2026-01-14T01:17:29.382447429Z" level=info msg="received container exit event container_id:\"ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d\" id:\"ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d\" pid:3610 exited_at:{seconds:1768353449 nanos:382034750}" Jan 14 01:17:29.383000 audit: BPF prog-id=168 op=UNLOAD Jan 14 01:17:29.421087 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ba60c28c3a827e2e8856dee101f48c1fed90143be562b275965ed47959ce821d-rootfs.mount: Deactivated successfully. Jan 14 01:17:29.990688 kubelet[2863]: E0114 01:17:29.990587 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:30.092116 containerd[1677]: time="2026-01-14T01:17:30.091536070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 14 01:17:31.990148 kubelet[2863]: E0114 01:17:31.989939 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:32.564110 containerd[1677]: time="2026-01-14T01:17:32.564054928Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:32.565258 containerd[1677]: time="2026-01-14T01:17:32.565137326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 14 01:17:32.567664 containerd[1677]: time="2026-01-14T01:17:32.567637463Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:32.569491 containerd[1677]: time="2026-01-14T01:17:32.569465020Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:32.570021 containerd[1677]: time="2026-01-14T01:17:32.569991199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.478425289s" Jan 14 01:17:32.570082 containerd[1677]: time="2026-01-14T01:17:32.570072219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 14 01:17:32.574444 containerd[1677]: time="2026-01-14T01:17:32.574416073Z" level=info msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 14 01:17:32.585022 containerd[1677]: time="2026-01-14T01:17:32.584317980Z" level=info msg="Container f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:32.594167 containerd[1677]: 
time="2026-01-14T01:17:32.594141316Z" level=info msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58\"" Jan 14 01:17:32.595025 containerd[1677]: time="2026-01-14T01:17:32.594654715Z" level=info msg="StartContainer for \"f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58\"" Jan 14 01:17:32.595803 containerd[1677]: time="2026-01-14T01:17:32.595787424Z" level=info msg="connecting to shim f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58" address="unix:///run/containerd/s/8b747dec2f230515377ffac264a521930b3d1651279be82ea28e9cb31f42d931" protocol=ttrpc version=3 Jan 14 01:17:32.611276 systemd[1]: Started cri-containerd-f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58.scope - libcontainer container f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58. 
Jan 14 01:17:32.662000 audit: BPF prog-id=169 op=LOAD Jan 14 01:17:32.663681 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 14 01:17:32.663763 kernel: audit: type=1334 audit(1768353452.662:571): prog-id=169 op=LOAD Jan 14 01:17:32.662000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.669225 kernel: audit: type=1300 audit(1768353452.662:571): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.674367 kernel: audit: type=1327 audit(1768353452.662:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.662000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.681087 kernel: audit: type=1334 audit(1768353452.665:572): prog-id=170 op=LOAD Jan 14 01:17:32.665000 audit: BPF prog-id=170 op=LOAD Jan 14 01:17:32.665000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.691096 kernel: audit: type=1300 audit(1768353452.665:572): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.691198 kernel: audit: type=1327 audit(1768353452.665:572): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.665000 audit: BPF prog-id=170 op=UNLOAD Jan 14 01:17:32.703707 kernel: audit: type=1334 audit(1768353452.665:573): prog-id=170 op=UNLOAD Jan 14 01:17:32.703758 kernel: audit: type=1300 audit(1768353452.665:573): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.665000 audit[3656]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.708047 kernel: audit: type=1327 audit(1768353452.665:573): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.710313 kernel: audit: type=1334 audit(1768353452.665:574): prog-id=169 op=UNLOAD Jan 14 01:17:32.665000 audit: BPF prog-id=169 op=UNLOAD Jan 14 01:17:32.665000 audit[3656]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.665000 audit: BPF prog-id=171 op=LOAD Jan 14 01:17:32.665000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3410 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:32.665000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6639343365313639346264656564316266663830373161393034393863 Jan 14 01:17:32.719948 containerd[1677]: time="2026-01-14T01:17:32.719914560Z" level=info msg="StartContainer for \"f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58\" returns successfully" Jan 14 01:17:33.245140 containerd[1677]: time="2026-01-14T01:17:33.245082827Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:17:33.248072 systemd[1]: cri-containerd-f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58.scope: Deactivated successfully. Jan 14 01:17:33.248452 systemd[1]: cri-containerd-f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58.scope: Consumed 520ms CPU time, 195M memory peak, 171.3M written to disk. Jan 14 01:17:33.250086 containerd[1677]: time="2026-01-14T01:17:33.249982141Z" level=info msg="received container exit event container_id:\"f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58\" id:\"f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58\" pid:3670 exited_at:{seconds:1768353453 nanos:249753631}" Jan 14 01:17:33.251000 audit: BPF prog-id=171 op=UNLOAD Jan 14 01:17:33.276465 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f943e1694bdeed1bff8071a90498c7d00c7d389c047b69151fcd9dcabd961e58-rootfs.mount: Deactivated successfully. 
Jan 14 01:17:33.284758 kubelet[2863]: I0114 01:17:33.283229 2863 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 14 01:17:33.373351 systemd[1]: Created slice kubepods-burstable-podd2a5e836_30ac_41d3_8f91_c61095ba41ec.slice - libcontainer container kubepods-burstable-podd2a5e836_30ac_41d3_8f91_c61095ba41ec.slice. Jan 14 01:17:33.389534 systemd[1]: Created slice kubepods-besteffort-pod8ba22e1c_b895_4e68_8414_171d12dc9bef.slice - libcontainer container kubepods-besteffort-pod8ba22e1c_b895_4e68_8414_171d12dc9bef.slice. Jan 14 01:17:33.398575 systemd[1]: Created slice kubepods-burstable-podcd931b03_7a61_4463_8de1_782ce3a3b938.slice - libcontainer container kubepods-burstable-podcd931b03_7a61_4463_8de1_782ce3a3b938.slice. Jan 14 01:17:33.409748 systemd[1]: Created slice kubepods-besteffort-pode9b7dca9_0cc9_40e4_b746_163e923a9fd3.slice - libcontainer container kubepods-besteffort-pode9b7dca9_0cc9_40e4_b746_163e923a9fd3.slice. Jan 14 01:17:33.412575 kubelet[2863]: I0114 01:17:33.412507 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcsgt\" (UniqueName: \"kubernetes.io/projected/8ba22e1c-b895-4e68-8414-171d12dc9bef-kube-api-access-pcsgt\") pod \"calico-kube-controllers-85b4f9c766-pwdc6\" (UID: \"8ba22e1c-b895-4e68-8414-171d12dc9bef\") " pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" Jan 14 01:17:33.412575 kubelet[2863]: I0114 01:17:33.412533 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd931b03-7a61-4463-8de1-782ce3a3b938-config-volume\") pod \"coredns-674b8bbfcf-7p9cs\" (UID: \"cd931b03-7a61-4463-8de1-782ce3a3b938\") " pod="kube-system/coredns-674b8bbfcf-7p9cs" Jan 14 01:17:33.412575 kubelet[2863]: I0114 01:17:33.412548 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtwj4\" 
(UniqueName: \"kubernetes.io/projected/d2a5e836-30ac-41d3-8f91-c61095ba41ec-kube-api-access-xtwj4\") pod \"coredns-674b8bbfcf-fq4fr\" (UID: \"d2a5e836-30ac-41d3-8f91-c61095ba41ec\") " pod="kube-system/coredns-674b8bbfcf-fq4fr" Jan 14 01:17:33.412829 kubelet[2863]: I0114 01:17:33.412761 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-backend-key-pair\") pod \"whisker-d9b94fbbb-hpdvb\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " pod="calico-system/whisker-d9b94fbbb-hpdvb" Jan 14 01:17:33.412829 kubelet[2863]: I0114 01:17:33.412778 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-ca-bundle\") pod \"whisker-d9b94fbbb-hpdvb\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " pod="calico-system/whisker-d9b94fbbb-hpdvb" Jan 14 01:17:33.412829 kubelet[2863]: I0114 01:17:33.412793 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbhpb\" (UniqueName: \"kubernetes.io/projected/cd931b03-7a61-4463-8de1-782ce3a3b938-kube-api-access-wbhpb\") pod \"coredns-674b8bbfcf-7p9cs\" (UID: \"cd931b03-7a61-4463-8de1-782ce3a3b938\") " pod="kube-system/coredns-674b8bbfcf-7p9cs" Jan 14 01:17:33.412829 kubelet[2863]: I0114 01:17:33.412804 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlp4v\" (UniqueName: \"kubernetes.io/projected/e9b7dca9-0cc9-40e4-b746-163e923a9fd3-kube-api-access-mlp4v\") pod \"calico-apiserver-75bdffd95c-w8fr4\" (UID: \"e9b7dca9-0cc9-40e4-b746-163e923a9fd3\") " pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" Jan 14 01:17:33.412990 kubelet[2863]: I0114 01:17:33.412914 2863 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0c40b23d-843d-4125-8dd6-0c99d44bb1dc-calico-apiserver-certs\") pod \"calico-apiserver-75bdffd95c-dllbx\" (UID: \"0c40b23d-843d-4125-8dd6-0c99d44bb1dc\") " pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" Jan 14 01:17:33.412990 kubelet[2863]: I0114 01:17:33.412931 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a5e836-30ac-41d3-8f91-c61095ba41ec-config-volume\") pod \"coredns-674b8bbfcf-fq4fr\" (UID: \"d2a5e836-30ac-41d3-8f91-c61095ba41ec\") " pod="kube-system/coredns-674b8bbfcf-fq4fr" Jan 14 01:17:33.412990 kubelet[2863]: I0114 01:17:33.412948 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a11f31d9-176c-489f-9754-8429b5bd5389-config\") pod \"goldmane-666569f655-l6m4k\" (UID: \"a11f31d9-176c-489f-9754-8429b5bd5389\") " pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.412990 kubelet[2863]: I0114 01:17:33.412961 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a11f31d9-176c-489f-9754-8429b5bd5389-goldmane-key-pair\") pod \"goldmane-666569f655-l6m4k\" (UID: \"a11f31d9-176c-489f-9754-8429b5bd5389\") " pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.412990 kubelet[2863]: I0114 01:17:33.412974 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4blz\" (UniqueName: \"kubernetes.io/projected/f4a68be9-46a2-4474-b3c6-09ddae1292f7-kube-api-access-t4blz\") pod \"whisker-d9b94fbbb-hpdvb\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " pod="calico-system/whisker-d9b94fbbb-hpdvb" Jan 14 01:17:33.413188 kubelet[2863]: I0114 
01:17:33.413122 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11f31d9-176c-489f-9754-8429b5bd5389-goldmane-ca-bundle\") pod \"goldmane-666569f655-l6m4k\" (UID: \"a11f31d9-176c-489f-9754-8429b5bd5389\") " pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.413188 kubelet[2863]: I0114 01:17:33.413140 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjlj\" (UniqueName: \"kubernetes.io/projected/a11f31d9-176c-489f-9754-8429b5bd5389-kube-api-access-snjlj\") pod \"goldmane-666569f655-l6m4k\" (UID: \"a11f31d9-176c-489f-9754-8429b5bd5389\") " pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.413188 kubelet[2863]: I0114 01:17:33.413153 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e9b7dca9-0cc9-40e4-b746-163e923a9fd3-calico-apiserver-certs\") pod \"calico-apiserver-75bdffd95c-w8fr4\" (UID: \"e9b7dca9-0cc9-40e4-b746-163e923a9fd3\") " pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" Jan 14 01:17:33.413188 kubelet[2863]: I0114 01:17:33.413167 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwmt\" (UniqueName: \"kubernetes.io/projected/0c40b23d-843d-4125-8dd6-0c99d44bb1dc-kube-api-access-ncwmt\") pod \"calico-apiserver-75bdffd95c-dllbx\" (UID: \"0c40b23d-843d-4125-8dd6-0c99d44bb1dc\") " pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" Jan 14 01:17:33.413395 kubelet[2863]: I0114 01:17:33.413314 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ba22e1c-b895-4e68-8414-171d12dc9bef-tigera-ca-bundle\") pod \"calico-kube-controllers-85b4f9c766-pwdc6\" (UID: 
\"8ba22e1c-b895-4e68-8414-171d12dc9bef\") " pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" Jan 14 01:17:33.420928 systemd[1]: Created slice kubepods-besteffort-pod0c40b23d_843d_4125_8dd6_0c99d44bb1dc.slice - libcontainer container kubepods-besteffort-pod0c40b23d_843d_4125_8dd6_0c99d44bb1dc.slice. Jan 14 01:17:33.427835 systemd[1]: Created slice kubepods-besteffort-podf4a68be9_46a2_4474_b3c6_09ddae1292f7.slice - libcontainer container kubepods-besteffort-podf4a68be9_46a2_4474_b3c6_09ddae1292f7.slice. Jan 14 01:17:33.433894 systemd[1]: Created slice kubepods-besteffort-poda11f31d9_176c_489f_9754_8429b5bd5389.slice - libcontainer container kubepods-besteffort-poda11f31d9_176c_489f_9754_8429b5bd5389.slice. Jan 14 01:17:33.683789 containerd[1677]: time="2026-01-14T01:17:33.683604886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fq4fr,Uid:d2a5e836-30ac-41d3-8f91-c61095ba41ec,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:33.695440 containerd[1677]: time="2026-01-14T01:17:33.695132651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4f9c766-pwdc6,Uid:8ba22e1c-b895-4e68-8414-171d12dc9bef,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:33.707930 containerd[1677]: time="2026-01-14T01:17:33.707871105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7p9cs,Uid:cd931b03-7a61-4463-8de1-782ce3a3b938,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:33.718672 containerd[1677]: time="2026-01-14T01:17:33.718575531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-w8fr4,Uid:e9b7dca9-0cc9-40e4-b746-163e923a9fd3,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:17:33.726854 containerd[1677]: time="2026-01-14T01:17:33.726780551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-dllbx,Uid:0c40b23d-843d-4125-8dd6-0c99d44bb1dc,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:17:33.734493 
containerd[1677]: time="2026-01-14T01:17:33.734437361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d9b94fbbb-hpdvb,Uid:f4a68be9-46a2-4474-b3c6-09ddae1292f7,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:33.739767 containerd[1677]: time="2026-01-14T01:17:33.739678624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-l6m4k,Uid:a11f31d9-176c-489f-9754-8429b5bd5389,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:33.841331 containerd[1677]: time="2026-01-14T01:17:33.841278653Z" level=error msg="Failed to destroy network for sandbox \"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.844817 containerd[1677]: time="2026-01-14T01:17:33.844779980Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fq4fr,Uid:d2a5e836-30ac-41d3-8f91-c61095ba41ec,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.845561 kubelet[2863]: E0114 01:17:33.845473 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.845561 kubelet[2863]: E0114 01:17:33.845546 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fq4fr" Jan 14 01:17:33.845662 kubelet[2863]: E0114 01:17:33.845571 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fq4fr" Jan 14 01:17:33.845662 kubelet[2863]: E0114 01:17:33.845627 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-fq4fr_kube-system(d2a5e836-30ac-41d3-8f91-c61095ba41ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-fq4fr_kube-system(d2a5e836-30ac-41d3-8f91-c61095ba41ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0ab27c62371e500610505681ec3412271270703fb0f456a7337d59855cdacb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-fq4fr" podUID="d2a5e836-30ac-41d3-8f91-c61095ba41ec" Jan 14 01:17:33.884805 containerd[1677]: time="2026-01-14T01:17:33.884747249Z" level=error msg="Failed to destroy network for sandbox \"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Jan 14 01:17:33.887815 containerd[1677]: time="2026-01-14T01:17:33.887683225Z" level=error msg="Failed to destroy network for sandbox \"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.888374 containerd[1677]: time="2026-01-14T01:17:33.888354564Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-l6m4k,Uid:a11f31d9-176c-489f-9754-8429b5bd5389,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.888839 kubelet[2863]: E0114 01:17:33.888710 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.888917 kubelet[2863]: E0114 01:17:33.888859 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.888917 kubelet[2863]: E0114 
01:17:33.888895 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-l6m4k" Jan 14 01:17:33.889371 kubelet[2863]: E0114 01:17:33.889045 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4cee41a3231be4b1377b6fed14af92fa014ead1cb000c0381e50bf6073a897a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:17:33.890904 containerd[1677]: time="2026-01-14T01:17:33.890864630Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7p9cs,Uid:cd931b03-7a61-4463-8de1-782ce3a3b938,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.891388 kubelet[2863]: E0114 01:17:33.891337 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.891541 kubelet[2863]: E0114 01:17:33.891472 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-7p9cs" Jan 14 01:17:33.891541 kubelet[2863]: E0114 01:17:33.891489 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-7p9cs" Jan 14 01:17:33.891691 kubelet[2863]: E0114 01:17:33.891661 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-7p9cs_kube-system(cd931b03-7a61-4463-8de1-782ce3a3b938)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-7p9cs_kube-system(cd931b03-7a61-4463-8de1-782ce3a3b938)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df5b5aca9e14ab61472ea4259977d655cec330949f12145531999da7e8dcc16f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-674b8bbfcf-7p9cs" podUID="cd931b03-7a61-4463-8de1-782ce3a3b938" Jan 14 01:17:33.898262 containerd[1677]: time="2026-01-14T01:17:33.898127921Z" level=error msg="Failed to destroy network for sandbox \"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.901715 containerd[1677]: time="2026-01-14T01:17:33.901655567Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4f9c766-pwdc6,Uid:8ba22e1c-b895-4e68-8414-171d12dc9bef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.902742 kubelet[2863]: E0114 01:17:33.901850 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.902742 kubelet[2863]: E0114 01:17:33.901891 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" Jan 14 01:17:33.902742 kubelet[2863]: E0114 01:17:33.901910 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" Jan 14 01:17:33.902831 kubelet[2863]: E0114 01:17:33.901947 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"afbd9c823fe9bd862b7b048c75eb200ba848432d4607820c72ffe98b6d84190e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:17:33.908013 containerd[1677]: time="2026-01-14T01:17:33.907940388Z" level=error msg="Failed to destroy network for sandbox \"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.909740 containerd[1677]: time="2026-01-14T01:17:33.909700016Z" level=error msg="Failed to destroy network for sandbox 
\"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.910447 containerd[1677]: time="2026-01-14T01:17:33.910426085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-w8fr4,Uid:e9b7dca9-0cc9-40e4-b746-163e923a9fd3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.911223 kubelet[2863]: E0114 01:17:33.910720 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.911223 kubelet[2863]: E0114 01:17:33.910767 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" Jan 14 01:17:33.911223 kubelet[2863]: E0114 01:17:33.910789 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" Jan 14 01:17:33.911324 kubelet[2863]: E0114 01:17:33.910828 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f2a76900d28e37dd60fa9076084480c021db305dd699afbc07f67ed9d37eaa30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:17:33.911811 containerd[1677]: time="2026-01-14T01:17:33.911266065Z" level=error msg="Failed to destroy network for sandbox \"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.912732 containerd[1677]: time="2026-01-14T01:17:33.912424283Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-dllbx,Uid:0c40b23d-843d-4125-8dd6-0c99d44bb1dc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.912808 kubelet[2863]: E0114 01:17:33.912547 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.912808 kubelet[2863]: E0114 01:17:33.912578 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" Jan 14 01:17:33.912808 kubelet[2863]: E0114 01:17:33.912591 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" Jan 14 01:17:33.912868 kubelet[2863]: E0114 01:17:33.912654 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc)\\\": 
rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f968b4a06b65e040a92b28e8af399a3ca7331cdd787c4068044686dfcd10a0ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:17:33.914086 containerd[1677]: time="2026-01-14T01:17:33.914060111Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d9b94fbbb-hpdvb,Uid:f4a68be9-46a2-4474-b3c6-09ddae1292f7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.914209 kubelet[2863]: E0114 01:17:33.914182 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:33.914253 kubelet[2863]: E0114 01:17:33.914230 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d9b94fbbb-hpdvb" Jan 14 01:17:33.914282 kubelet[2863]: E0114 01:17:33.914250 
2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d9b94fbbb-hpdvb" Jan 14 01:17:33.914304 kubelet[2863]: E0114 01:17:33.914280 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d9b94fbbb-hpdvb_calico-system(f4a68be9-46a2-4474-b3c6-09ddae1292f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d9b94fbbb-hpdvb_calico-system(f4a68be9-46a2-4474-b3c6-09ddae1292f7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc25caf5b24bd110ca657bca083aadb2acf34ffa15788581601bddb81240c3a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d9b94fbbb-hpdvb" podUID="f4a68be9-46a2-4474-b3c6-09ddae1292f7" Jan 14 01:17:34.000142 systemd[1]: Created slice kubepods-besteffort-podf9f7ed9b_3d95_4d57_9bbf_9bd4bf98db1b.slice - libcontainer container kubepods-besteffort-podf9f7ed9b_3d95_4d57_9bbf_9bd4bf98db1b.slice. 
Jan 14 01:17:34.004829 containerd[1677]: time="2026-01-14T01:17:34.004749516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pj96q,Uid:f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:34.057739 containerd[1677]: time="2026-01-14T01:17:34.057688204Z" level=error msg="Failed to destroy network for sandbox \"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:34.060605 containerd[1677]: time="2026-01-14T01:17:34.060565580Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pj96q,Uid:f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:34.060857 kubelet[2863]: E0114 01:17:34.060796 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:17:34.060917 kubelet[2863]: E0114 01:17:34.060866 2863 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:34.060917 kubelet[2863]: E0114 01:17:34.060883 2863 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pj96q" Jan 14 01:17:34.061077 kubelet[2863]: E0114 01:17:34.060940 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88413aa56a3dc04490662ff6cc34c07ca8113be59a13f5b4eab1bef2c954f0b7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:34.112358 containerd[1677]: time="2026-01-14T01:17:34.112289789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 14 01:17:34.586199 systemd[1]: run-netns-cni\x2def2ee494\x2d110a\x2df9f8\x2d4535\x2d55bd9acd2423.mount: Deactivated successfully. Jan 14 01:17:34.586378 systemd[1]: run-netns-cni\x2dcf496c9e\x2dd9cd\x2d3abc\x2d6e2d\x2db3b4ea6ffa47.mount: Deactivated successfully. Jan 14 01:17:34.586500 systemd[1]: run-netns-cni\x2d28fc52cb\x2d50a7\x2d7b4a\x2dbff7\x2d0f20b57a913e.mount: Deactivated successfully. 
Jan 14 01:17:34.586618 systemd[1]: run-netns-cni\x2dd4c61951\x2d8a52\x2d4b1c\x2dbb81\x2dbc81cddc5d6a.mount: Deactivated successfully. Jan 14 01:17:38.222344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1288036338.mount: Deactivated successfully. Jan 14 01:17:38.250213 containerd[1677]: time="2026-01-14T01:17:38.250148615Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:38.251274 containerd[1677]: time="2026-01-14T01:17:38.251238734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 14 01:17:38.252321 containerd[1677]: time="2026-01-14T01:17:38.252272744Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:38.253639 containerd[1677]: time="2026-01-14T01:17:38.253617922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:17:38.254216 containerd[1677]: time="2026-01-14T01:17:38.253943982Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 4.141616343s" Jan 14 01:17:38.254216 containerd[1677]: time="2026-01-14T01:17:38.253976792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 14 01:17:38.272891 containerd[1677]: time="2026-01-14T01:17:38.272840086Z" level=info 
msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 14 01:17:38.285758 containerd[1677]: time="2026-01-14T01:17:38.284329247Z" level=info msg="Container c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:38.288992 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2666385240.mount: Deactivated successfully. Jan 14 01:17:38.292237 containerd[1677]: time="2026-01-14T01:17:38.292192681Z" level=info msg="CreateContainer within sandbox \"0318d7ef06e0b80b74628b85ea6d662e64e6f96d123aec203f13d58424cda779\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c\"" Jan 14 01:17:38.292842 containerd[1677]: time="2026-01-14T01:17:38.292779580Z" level=info msg="StartContainer for \"c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c\"" Jan 14 01:17:38.294093 containerd[1677]: time="2026-01-14T01:17:38.294037219Z" level=info msg="connecting to shim c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c" address="unix:///run/containerd/s/8b747dec2f230515377ffac264a521930b3d1651279be82ea28e9cb31f42d931" protocol=ttrpc version=3 Jan 14 01:17:38.332189 systemd[1]: Started cri-containerd-c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c.scope - libcontainer container c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c. 
Jan 14 01:17:38.385000 audit: BPF prog-id=172 op=LOAD Jan 14 01:17:38.387771 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 14 01:17:38.387846 kernel: audit: type=1334 audit(1768353458.385:577): prog-id=172 op=LOAD Jan 14 01:17:38.385000 audit[3928]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.392169 kernel: audit: type=1300 audit(1768353458.385:577): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.405461 kernel: audit: type=1327 audit(1768353458.385:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.407664 kernel: audit: type=1334 audit(1768353458.385:578): prog-id=173 op=LOAD Jan 14 01:17:38.385000 audit: BPF prog-id=173 op=LOAD Jan 14 01:17:38.414306 kernel: audit: type=1300 audit(1768353458.385:578): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit[3928]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.385000 audit: BPF prog-id=173 op=UNLOAD Jan 14 01:17:38.421504 kernel: audit: type=1327 audit(1768353458.385:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.421567 kernel: audit: type=1334 audit(1768353458.385:579): prog-id=173 op=UNLOAD Jan 14 01:17:38.385000 audit[3928]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.424123 kernel: audit: type=1300 audit(1768353458.385:579): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.436506 containerd[1677]: time="2026-01-14T01:17:38.436446023Z" level=info msg="StartContainer for \"c799c0fa5c39df2093a23598fe9e167e3e2aecae9f66b927a7e3b2fb2384235c\" returns successfully" Jan 14 01:17:38.385000 audit: BPF prog-id=172 op=UNLOAD Jan 14 01:17:38.437103 kernel: audit: type=1327 audit(1768353458.385:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.437191 kernel: audit: type=1334 audit(1768353458.385:580): prog-id=172 op=UNLOAD Jan 14 01:17:38.385000 audit[3928]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.385000 audit: BPF prog-id=174 op=LOAD Jan 14 01:17:38.385000 audit[3928]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3410 pid=3928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:38.385000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337393963306661356333396466323039336132333539386665396531 Jan 14 01:17:38.524784 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 14 01:17:38.524948 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 14 01:17:38.750809 kubelet[2863]: I0114 01:17:38.750774 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-ca-bundle\") pod \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " Jan 14 01:17:38.750809 kubelet[2863]: I0114 01:17:38.750807 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4blz\" (UniqueName: \"kubernetes.io/projected/f4a68be9-46a2-4474-b3c6-09ddae1292f7-kube-api-access-t4blz\") pod \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " Jan 14 01:17:38.751296 kubelet[2863]: I0114 01:17:38.750826 2863 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-backend-key-pair\") pod \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\" (UID: \"f4a68be9-46a2-4474-b3c6-09ddae1292f7\") " Jan 14 01:17:38.752753 kubelet[2863]: I0114 01:17:38.752720 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f4a68be9-46a2-4474-b3c6-09ddae1292f7" (UID: "f4a68be9-46a2-4474-b3c6-09ddae1292f7"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 14 01:17:38.756826 kubelet[2863]: I0114 01:17:38.756790 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f4a68be9-46a2-4474-b3c6-09ddae1292f7" (UID: "f4a68be9-46a2-4474-b3c6-09ddae1292f7"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 14 01:17:38.757051 kubelet[2863]: I0114 01:17:38.757028 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a68be9-46a2-4474-b3c6-09ddae1292f7-kube-api-access-t4blz" (OuterVolumeSpecName: "kube-api-access-t4blz") pod "f4a68be9-46a2-4474-b3c6-09ddae1292f7" (UID: "f4a68be9-46a2-4474-b3c6-09ddae1292f7"). InnerVolumeSpecName "kube-api-access-t4blz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 14 01:17:38.852166 kubelet[2863]: I0114 01:17:38.852103 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-backend-key-pair\") on node \"ci-4578-0-0-p-2c3a114250\" DevicePath \"\"" Jan 14 01:17:38.852166 kubelet[2863]: I0114 01:17:38.852165 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4a68be9-46a2-4474-b3c6-09ddae1292f7-whisker-ca-bundle\") on node \"ci-4578-0-0-p-2c3a114250\" DevicePath \"\"" Jan 14 01:17:38.852365 kubelet[2863]: I0114 01:17:38.852193 2863 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4blz\" (UniqueName: \"kubernetes.io/projected/f4a68be9-46a2-4474-b3c6-09ddae1292f7-kube-api-access-t4blz\") on node \"ci-4578-0-0-p-2c3a114250\" DevicePath \"\"" Jan 14 01:17:39.002271 systemd[1]: Removed slice kubepods-besteffort-podf4a68be9_46a2_4474_b3c6_09ddae1292f7.slice - 
libcontainer container kubepods-besteffort-podf4a68be9_46a2_4474_b3c6_09ddae1292f7.slice. Jan 14 01:17:39.145917 kubelet[2863]: I0114 01:17:39.145246 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-qwbj7" podStartSLOduration=2.340288246 podStartE2EDuration="16.14523022s" podCreationTimestamp="2026-01-14 01:17:23 +0000 UTC" firstStartedPulling="2026-01-14 01:17:24.449672468 +0000 UTC m=+19.584591721" lastFinishedPulling="2026-01-14 01:17:38.254614442 +0000 UTC m=+33.389533695" observedRunningTime="2026-01-14 01:17:39.142945302 +0000 UTC m=+34.277864565" watchObservedRunningTime="2026-01-14 01:17:39.14523022 +0000 UTC m=+34.280149483" Jan 14 01:17:39.211107 systemd[1]: Created slice kubepods-besteffort-pode35dc45d_4646_4773_8f3b_b3b00ec76393.slice - libcontainer container kubepods-besteffort-pode35dc45d_4646_4773_8f3b_b3b00ec76393.slice. Jan 14 01:17:39.226655 systemd[1]: var-lib-kubelet-pods-f4a68be9\x2d46a2\x2d4474\x2db3c6\x2d09ddae1292f7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dt4blz.mount: Deactivated successfully. Jan 14 01:17:39.226850 systemd[1]: var-lib-kubelet-pods-f4a68be9\x2d46a2\x2d4474\x2db3c6\x2d09ddae1292f7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 14 01:17:39.255332 kubelet[2863]: I0114 01:17:39.255037 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35dc45d-4646-4773-8f3b-b3b00ec76393-whisker-ca-bundle\") pod \"whisker-76bdf97696-8ln4x\" (UID: \"e35dc45d-4646-4773-8f3b-b3b00ec76393\") " pod="calico-system/whisker-76bdf97696-8ln4x" Jan 14 01:17:39.255332 kubelet[2863]: I0114 01:17:39.255091 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrmv\" (UniqueName: \"kubernetes.io/projected/e35dc45d-4646-4773-8f3b-b3b00ec76393-kube-api-access-pgrmv\") pod \"whisker-76bdf97696-8ln4x\" (UID: \"e35dc45d-4646-4773-8f3b-b3b00ec76393\") " pod="calico-system/whisker-76bdf97696-8ln4x" Jan 14 01:17:39.255332 kubelet[2863]: I0114 01:17:39.255127 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e35dc45d-4646-4773-8f3b-b3b00ec76393-whisker-backend-key-pair\") pod \"whisker-76bdf97696-8ln4x\" (UID: \"e35dc45d-4646-4773-8f3b-b3b00ec76393\") " pod="calico-system/whisker-76bdf97696-8ln4x" Jan 14 01:17:39.519114 containerd[1677]: time="2026-01-14T01:17:39.518936615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76bdf97696-8ln4x,Uid:e35dc45d-4646-4773-8f3b-b3b00ec76393,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:39.772432 systemd-networkd[1572]: cali8bfa8f6d423: Link UP Jan 14 01:17:39.774613 systemd-networkd[1572]: cali8bfa8f6d423: Gained carrier Jan 14 01:17:39.805056 containerd[1677]: 2026-01-14 01:17:39.568 [INFO][3996] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:39.805056 containerd[1677]: 2026-01-14 01:17:39.637 [INFO][3996] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0 whisker-76bdf97696- calico-system e35dc45d-4646-4773-8f3b-b3b00ec76393 871 0 2026-01-14 01:17:39 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:76bdf97696 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 whisker-76bdf97696-8ln4x eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali8bfa8f6d423 [] [] }} ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-" Jan 14 01:17:39.805056 containerd[1677]: 2026-01-14 01:17:39.637 [INFO][3996] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.805056 containerd[1677]: 2026-01-14 01:17:39.687 [INFO][4008] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" HandleID="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Workload="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.688 [INFO][4008] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" HandleID="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Workload="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55e0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"whisker-76bdf97696-8ln4x", "timestamp":"2026-01-14 01:17:39.687734001 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.688 [INFO][4008] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.688 [INFO][4008] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.688 [INFO][4008] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.700 [INFO][4008] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.708 [INFO][4008] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.716 [INFO][4008] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.719 [INFO][4008] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805414 containerd[1677]: 2026-01-14 01:17:39.724 [INFO][4008] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.724 [INFO][4008] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 
handle="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.727 [INFO][4008] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108 Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.734 [INFO][4008] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.741 [INFO][4008] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.193/26] block=192.168.92.192/26 handle="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.741 [INFO][4008] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.193/26] handle="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.741 [INFO][4008] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:17:39.805797 containerd[1677]: 2026-01-14 01:17:39.741 [INFO][4008] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.193/26] IPv6=[] ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" HandleID="k8s-pod-network.23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Workload="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.806186 containerd[1677]: 2026-01-14 01:17:39.749 [INFO][3996] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0", GenerateName:"whisker-76bdf97696-", Namespace:"calico-system", SelfLink:"", UID:"e35dc45d-4646-4773-8f3b-b3b00ec76393", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"76bdf97696", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"whisker-76bdf97696-8ln4x", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.92.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali8bfa8f6d423", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:39.806186 containerd[1677]: 2026-01-14 01:17:39.750 [INFO][3996] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.193/32] ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.806328 containerd[1677]: 2026-01-14 01:17:39.750 [INFO][3996] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8bfa8f6d423 ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.806328 containerd[1677]: 2026-01-14 01:17:39.775 [INFO][3996] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.806408 containerd[1677]: 2026-01-14 01:17:39.776 [INFO][3996] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0", GenerateName:"whisker-76bdf97696-", Namespace:"calico-system", SelfLink:"", 
UID:"e35dc45d-4646-4773-8f3b-b3b00ec76393", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"76bdf97696", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108", Pod:"whisker-76bdf97696-8ln4x", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.92.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8bfa8f6d423", MAC:"9e:b3:99:af:83:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:39.806498 containerd[1677]: 2026-01-14 01:17:39.801 [INFO][3996] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" Namespace="calico-system" Pod="whisker-76bdf97696-8ln4x" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-whisker--76bdf97696--8ln4x-eth0" Jan 14 01:17:39.875869 containerd[1677]: time="2026-01-14T01:17:39.875794513Z" level=info msg="connecting to shim 23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108" address="unix:///run/containerd/s/1dc996ec8c9f91d24d5f92c8b4e530a7d5984b2d84321ff84599ac319a836844" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:39.930515 systemd[1]: Started 
cri-containerd-23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108.scope - libcontainer container 23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108. Jan 14 01:17:39.954000 audit: BPF prog-id=175 op=LOAD Jan 14 01:17:39.955000 audit: BPF prog-id=176 op=LOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=176 op=UNLOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=177 op=LOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=178 op=LOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=178 op=UNLOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=177 op=UNLOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:39.955000 audit: BPF prog-id=179 op=LOAD Jan 14 01:17:39.955000 audit[4088]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4066 pid=4088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:39.955000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233623132633865303864653865346165636135636261636639643431 Jan 14 01:17:40.015505 containerd[1677]: time="2026-01-14T01:17:40.015454491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76bdf97696-8ln4x,Uid:e35dc45d-4646-4773-8f3b-b3b00ec76393,Namespace:calico-system,Attempt:0,} returns sandbox id \"23b12c8e08de8e4aeca5cbacf9d411fefe03833f4479d0517ce81aba048af108\"" Jan 14 01:17:40.018372 containerd[1677]: time="2026-01-14T01:17:40.018344319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:17:40.129117 kubelet[2863]: I0114 01:17:40.129015 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:17:40.442914 containerd[1677]: time="2026-01-14T01:17:40.442678958Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:40.444585 containerd[1677]: time="2026-01-14T01:17:40.444478707Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:17:40.444585 containerd[1677]: time="2026-01-14T01:17:40.444517357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:40.445099 kubelet[2863]: E0114 01:17:40.444985 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:17:40.445099 kubelet[2863]: E0114 01:17:40.445085 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:17:40.451717 kubelet[2863]: E0114 01:17:40.451581 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0fc0a424dd174edd96fbff1617ff49ab,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:40.455992 containerd[1677]: time="2026-01-14T01:17:40.455947059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:17:40.929000 containerd[1677]: 
time="2026-01-14T01:17:40.928926625Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:40.930369 containerd[1677]: time="2026-01-14T01:17:40.930274094Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:17:40.930369 containerd[1677]: time="2026-01-14T01:17:40.930359514Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:40.931140 kubelet[2863]: E0114 01:17:40.930542 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:17:40.931140 kubelet[2863]: E0114 01:17:40.930605 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:17:40.931258 kubelet[2863]: E0114 01:17:40.930781 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:40.932105 kubelet[2863]: E0114 01:17:40.931939 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:17:40.994883 kubelet[2863]: I0114 01:17:40.994467 2863 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a68be9-46a2-4474-b3c6-09ddae1292f7" path="/var/lib/kubelet/pods/f4a68be9-46a2-4474-b3c6-09ddae1292f7/volumes" Jan 14 01:17:41.137881 kubelet[2863]: E0114 01:17:41.137625 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:17:41.190000 audit[4172]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4172 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:41.190000 audit[4172]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffce8e9ad30 a2=0 a3=7ffce8e9ad1c items=0 ppid=3013 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:41.190000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:41.195000 audit[4172]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4172 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:41.195000 audit[4172]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffce8e9ad30 a2=0 a3=0 items=0 ppid=3013 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:41.195000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:41.287170 systemd-networkd[1572]: cali8bfa8f6d423: Gained IPv6LL Jan 14 01:17:44.991500 containerd[1677]: time="2026-01-14T01:17:44.991255931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4f9c766-pwdc6,Uid:8ba22e1c-b895-4e68-8414-171d12dc9bef,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:44.992237 containerd[1677]: time="2026-01-14T01:17:44.991709852Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-7p9cs,Uid:cd931b03-7a61-4463-8de1-782ce3a3b938,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:44.992237 containerd[1677]: time="2026-01-14T01:17:44.992110932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pj96q,Uid:f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:45.177168 systemd-networkd[1572]: cali25f60380d99: Link UP Jan 14 01:17:45.182916 systemd-networkd[1572]: cali25f60380d99: Gained carrier Jan 14 01:17:45.205153 containerd[1677]: 2026-01-14 01:17:45.064 [INFO][4248] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:45.205153 containerd[1677]: 2026-01-14 01:17:45.081 [INFO][4248] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0 calico-kube-controllers-85b4f9c766- calico-system 8ba22e1c-b895-4e68-8414-171d12dc9bef 806 0 2026-01-14 01:17:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:85b4f9c766 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 calico-kube-controllers-85b4f9c766-pwdc6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali25f60380d99 [] [] }} ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-" Jan 14 01:17:45.205153 containerd[1677]: 2026-01-14 01:17:45.082 [INFO][4248] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" 
Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205153 containerd[1677]: 2026-01-14 01:17:45.118 [INFO][4290] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" HandleID="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.119 [INFO][4290] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" HandleID="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f1c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"calico-kube-controllers-85b4f9c766-pwdc6", "timestamp":"2026-01-14 01:17:45.118923064 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.119 [INFO][4290] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.119 [INFO][4290] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.119 [INFO][4290] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.126 [INFO][4290] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.132 [INFO][4290] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.138 [INFO][4290] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.141 [INFO][4290] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205433 containerd[1677]: 2026-01-14 01:17:45.145 [INFO][4290] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.145 [INFO][4290] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.148 [INFO][4290] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73 Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.154 [INFO][4290] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.162 [INFO][4290] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.92.194/26] block=192.168.92.192/26 handle="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.162 [INFO][4290] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.194/26] handle="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.163 [INFO][4290] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:17:45.205672 containerd[1677]: 2026-01-14 01:17:45.163 [INFO][4290] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.194/26] IPv6=[] ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" HandleID="k8s-pod-network.7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205827 containerd[1677]: 2026-01-14 01:17:45.169 [INFO][4248] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0", GenerateName:"calico-kube-controllers-85b4f9c766-", Namespace:"calico-system", SelfLink:"", UID:"8ba22e1c-b895-4e68-8414-171d12dc9bef", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85b4f9c766", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"calico-kube-controllers-85b4f9c766-pwdc6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25f60380d99", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.205883 containerd[1677]: 2026-01-14 01:17:45.170 [INFO][4248] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.194/32] ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205883 containerd[1677]: 2026-01-14 01:17:45.171 [INFO][4248] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25f60380d99 ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205883 containerd[1677]: 2026-01-14 01:17:45.184 [INFO][4248] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.205951 containerd[1677]: 2026-01-14 01:17:45.187 [INFO][4248] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0", GenerateName:"calico-kube-controllers-85b4f9c766-", Namespace:"calico-system", SelfLink:"", UID:"8ba22e1c-b895-4e68-8414-171d12dc9bef", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85b4f9c766", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73", Pod:"calico-kube-controllers-85b4f9c766-pwdc6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25f60380d99", MAC:"0a:8a:5b:c3:f9:d8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.206374 containerd[1677]: 2026-01-14 01:17:45.202 [INFO][4248] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" Namespace="calico-system" Pod="calico-kube-controllers-85b4f9c766-pwdc6" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--kube--controllers--85b4f9c766--pwdc6-eth0" Jan 14 01:17:45.230059 containerd[1677]: time="2026-01-14T01:17:45.229922994Z" level=info msg="connecting to shim 7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73" address="unix:///run/containerd/s/2140099ef3fa3d5ae055008237c0b8b5fb66a2c7ff32fcb9c03b664c7d2ff9fd" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:45.271481 systemd[1]: Started cri-containerd-7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73.scope - libcontainer container 7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73. 
Jan 14 01:17:45.281303 systemd-networkd[1572]: cali6f4383b743a: Link UP Jan 14 01:17:45.281546 systemd-networkd[1572]: cali6f4383b743a: Gained carrier Jan 14 01:17:45.308547 kernel: kauditd_printk_skb: 33 callbacks suppressed Jan 14 01:17:45.308800 kernel: audit: type=1334 audit(1768353465.303:592): prog-id=180 op=LOAD Jan 14 01:17:45.303000 audit: BPF prog-id=180 op=LOAD Jan 14 01:17:45.308000 audit: BPF prog-id=181 op=LOAD Jan 14 01:17:45.313100 kernel: audit: type=1334 audit(1768353465.308:593): prog-id=181 op=LOAD Jan 14 01:17:45.308000 audit[4336]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.316673 containerd[1677]: 2026-01-14 01:17:45.042 [INFO][4255] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:45.316673 containerd[1677]: 2026-01-14 01:17:45.057 [INFO][4255] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0 csi-node-driver- calico-system f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b 707 0 2026-01-14 01:17:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 csi-node-driver-pj96q eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6f4383b743a [] [] }} ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" 
WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-" Jan 14 01:17:45.316673 containerd[1677]: 2026-01-14 01:17:45.057 [INFO][4255] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.316673 containerd[1677]: 2026-01-14 01:17:45.133 [INFO][4281] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" HandleID="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Workload="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.133 [INFO][4281] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" HandleID="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Workload="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"csi-node-driver-pj96q", "timestamp":"2026-01-14 01:17:45.133524259 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.133 [INFO][4281] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.163 [INFO][4281] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.163 [INFO][4281] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.227 [INFO][4281] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.234 [INFO][4281] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.240 [INFO][4281] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.242 [INFO][4281] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.316869 containerd[1677]: 2026-01-14 01:17:45.246 [INFO][4281] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.246 [INFO][4281] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.248 [INFO][4281] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.255 [INFO][4281] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.264 [INFO][4281] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.92.195/26] block=192.168.92.192/26 handle="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.264 [INFO][4281] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.195/26] handle="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.264 [INFO][4281] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:17:45.317549 containerd[1677]: 2026-01-14 01:17:45.264 [INFO][4281] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.195/26] IPv6=[] ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" HandleID="k8s-pod-network.c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Workload="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.317664 containerd[1677]: 2026-01-14 01:17:45.274 [INFO][4255] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"csi-node-driver-pj96q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.92.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6f4383b743a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.317708 containerd[1677]: 2026-01-14 01:17:45.274 [INFO][4255] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.195/32] ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.317708 containerd[1677]: 2026-01-14 01:17:45.274 [INFO][4255] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f4383b743a ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.317708 containerd[1677]: 2026-01-14 01:17:45.280 [INFO][4255] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.318109 
containerd[1677]: 2026-01-14 01:17:45.290 [INFO][4255] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b", Pod:"csi-node-driver-pj96q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.92.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6f4383b743a", MAC:"3e:55:d9:d9:26:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.318227 containerd[1677]: 
2026-01-14 01:17:45.309 [INFO][4255] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" Namespace="calico-system" Pod="csi-node-driver-pj96q" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-csi--node--driver--pj96q-eth0" Jan 14 01:17:45.321060 kernel: audit: type=1300 audit(1768353465.308:593): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.331052 kernel: audit: type=1327 audit(1768353465.308:593): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.308000 audit: BPF prog-id=181 op=UNLOAD Jan 14 01:17:45.340494 kernel: audit: type=1334 audit(1768353465.308:594): prog-id=181 op=UNLOAD Jan 14 01:17:45.340621 kernel: audit: type=1300 audit(1768353465.308:594): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.308000 audit[4336]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.347996 kernel: audit: type=1327 audit(1768353465.308:594): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: BPF prog-id=182 op=LOAD Jan 14 01:17:45.359182 kernel: audit: type=1334 audit(1768353465.309:595): prog-id=182 op=LOAD Jan 14 01:17:45.359300 kernel: audit: type=1300 audit(1768353465.309:595): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.309000 audit[4336]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.367469 kernel: audit: type=1327 audit(1768353465.309:595): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: BPF prog-id=183 op=LOAD Jan 14 01:17:45.309000 audit[4336]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: BPF prog-id=183 op=UNLOAD Jan 14 01:17:45.309000 audit[4336]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: BPF prog-id=182 op=UNLOAD Jan 14 01:17:45.309000 audit[4336]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:45.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.309000 audit: BPF prog-id=184 op=LOAD Jan 14 01:17:45.309000 audit[4336]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4324 pid=4336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763356537373334663362613037366132323663656639326333663461 Jan 14 01:17:45.375126 systemd-networkd[1572]: calid91d779bba2: Link UP Jan 14 01:17:45.376246 systemd-networkd[1572]: calid91d779bba2: Gained carrier Jan 14 01:17:45.398971 containerd[1677]: time="2026-01-14T01:17:45.398925102Z" level=info msg="connecting to shim c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b" address="unix:///run/containerd/s/f3d434d95f07cbdf52b9d3af3385810f4991904d535470af68eb32caa4fbd4fb" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:45.410165 containerd[1677]: 2026-01-14 01:17:45.078 [INFO][4256] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:45.410165 containerd[1677]: 2026-01-14 01:17:45.097 [INFO][4256] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0 coredns-674b8bbfcf- kube-system cd931b03-7a61-4463-8de1-782ce3a3b938 807 0 2026-01-14 01:17:10 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 coredns-674b8bbfcf-7p9cs eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid91d779bba2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-" Jan 14 01:17:45.410165 containerd[1677]: 2026-01-14 01:17:45.097 [INFO][4256] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410165 containerd[1677]: 2026-01-14 01:17:45.151 [INFO][4296] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" HandleID="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.151 [INFO][4296] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" HandleID="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f220), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"coredns-674b8bbfcf-7p9cs", "timestamp":"2026-01-14 01:17:45.151697432 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.151 [INFO][4296] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.265 [INFO][4296] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.265 [INFO][4296] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.327 [INFO][4296] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.336 [INFO][4296] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.341 [INFO][4296] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.344 [INFO][4296] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410419 containerd[1677]: 2026-01-14 01:17:45.346 [INFO][4296] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.346 [INFO][4296] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.350 [INFO][4296] ipam/ipam.go 1780: Creating 
new handle: k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244 Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.358 [INFO][4296] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.364 [INFO][4296] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.196/26] block=192.168.92.192/26 handle="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.364 [INFO][4296] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.196/26] handle="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.364 [INFO][4296] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:17:45.410631 containerd[1677]: 2026-01-14 01:17:45.364 [INFO][4296] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.196/26] IPv6=[] ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" HandleID="k8s-pod-network.f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410823 containerd[1677]: 2026-01-14 01:17:45.368 [INFO][4256] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cd931b03-7a61-4463-8de1-782ce3a3b938", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"coredns-674b8bbfcf-7p9cs", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calid91d779bba2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.410823 containerd[1677]: 2026-01-14 01:17:45.369 [INFO][4256] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.196/32] ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410823 containerd[1677]: 2026-01-14 01:17:45.369 [INFO][4256] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid91d779bba2 ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410823 containerd[1677]: 2026-01-14 01:17:45.379 [INFO][4256] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.410823 containerd[1677]: 2026-01-14 01:17:45.383 [INFO][4256] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" 
WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cd931b03-7a61-4463-8de1-782ce3a3b938", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244", Pod:"coredns-674b8bbfcf-7p9cs", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid91d779bba2", MAC:"fa:79:5c:bf:c9:fd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:45.410823 
containerd[1677]: 2026-01-14 01:17:45.404 [INFO][4256] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" Namespace="kube-system" Pod="coredns-674b8bbfcf-7p9cs" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--7p9cs-eth0" Jan 14 01:17:45.413421 containerd[1677]: time="2026-01-14T01:17:45.413377547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4f9c766-pwdc6,Uid:8ba22e1c-b895-4e68-8414-171d12dc9bef,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c5e7734f3ba076a226cef92c3f4a78d4409096dbf230bdd17d9635ff9bb3f73\"" Jan 14 01:17:45.416213 containerd[1677]: time="2026-01-14T01:17:45.416103805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:17:45.441398 containerd[1677]: time="2026-01-14T01:17:45.441345387Z" level=info msg="connecting to shim f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244" address="unix:///run/containerd/s/11599ed987687ce04232313e8b04fc213182b2d23b0dfb598425662445448bef" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:45.444269 systemd[1]: Started cri-containerd-c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b.scope - libcontainer container c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b. 
Jan 14 01:17:45.462000 audit: BPF prog-id=185 op=LOAD Jan 14 01:17:45.463000 audit: BPF prog-id=186 op=LOAD Jan 14 01:17:45.463000 audit[4392]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.463000 audit: BPF prog-id=186 op=UNLOAD Jan 14 01:17:45.463000 audit[4392]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.463000 audit: BPF prog-id=187 op=LOAD Jan 14 01:17:45.463000 audit[4392]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.463000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.463000 audit: BPF prog-id=188 op=LOAD Jan 14 01:17:45.463000 audit[4392]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.464000 audit: BPF prog-id=188 op=UNLOAD Jan 14 01:17:45.464000 audit[4392]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.464000 audit: BPF prog-id=187 op=UNLOAD Jan 14 01:17:45.464000 audit[4392]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:45.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.464000 audit: BPF prog-id=189 op=LOAD Jan 14 01:17:45.464000 audit[4392]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4371 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6330643763343566346432343230386430373262396462656335346535 Jan 14 01:17:45.477225 systemd[1]: Started cri-containerd-f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244.scope - libcontainer container f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244. 
Jan 14 01:17:45.495000 audit: BPF prog-id=190 op=LOAD Jan 14 01:17:45.496000 audit: BPF prog-id=191 op=LOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=191 op=UNLOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=192 op=LOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=193 op=LOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=193 op=UNLOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=192 op=UNLOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.496000 audit: BPF prog-id=194 op=LOAD Jan 14 01:17:45.496000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4412 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353639306163653930366264643666333238663461373966633162 Jan 14 01:17:45.500878 containerd[1677]: time="2026-01-14T01:17:45.500843515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pj96q,Uid:f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b,Namespace:calico-system,Attempt:0,} returns sandbox id \"c0d7c45f4d24208d072b9dbec54e5ba78835ee7f062dc5574f95a26f21909b6b\"" Jan 14 01:17:45.537050 containerd[1677]: time="2026-01-14T01:17:45.536895342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7p9cs,Uid:cd931b03-7a61-4463-8de1-782ce3a3b938,Namespace:kube-system,Attempt:0,} returns sandbox id \"f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244\"" Jan 14 01:17:45.543406 containerd[1677]: time="2026-01-14T01:17:45.543372409Z" level=info msg="CreateContainer within sandbox \"f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:17:45.558945 containerd[1677]: time="2026-01-14T01:17:45.558305253Z" level=info msg="Container 
f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:45.566793 containerd[1677]: time="2026-01-14T01:17:45.566646431Z" level=info msg="CreateContainer within sandbox \"f45690ace906bdd6f328f4a79fc1bf8c29991aa46a963cb8b4f8c4c3e6cfc244\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a\"" Jan 14 01:17:45.567987 containerd[1677]: time="2026-01-14T01:17:45.567967981Z" level=info msg="StartContainer for \"f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a\"" Jan 14 01:17:45.569740 containerd[1677]: time="2026-01-14T01:17:45.569680919Z" level=info msg="connecting to shim f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a" address="unix:///run/containerd/s/11599ed987687ce04232313e8b04fc213182b2d23b0dfb598425662445448bef" protocol=ttrpc version=3 Jan 14 01:17:45.605630 systemd[1]: Started cri-containerd-f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a.scope - libcontainer container f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a. 
Jan 14 01:17:45.622000 audit: BPF prog-id=195 op=LOAD Jan 14 01:17:45.624000 audit: BPF prog-id=196 op=LOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=196 op=UNLOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=197 op=LOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=198 op=LOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=198 op=UNLOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=197 op=UNLOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.624000 audit: BPF prog-id=199 op=LOAD Jan 14 01:17:45.624000 audit[4476]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4412 pid=4476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:45.624000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634626538383164663936346335366537393236303266373435626331 Jan 14 01:17:45.656225 containerd[1677]: time="2026-01-14T01:17:45.656157398Z" level=info msg="StartContainer for \"f4be881df964c56e792602f745bc1cb88da244d13f33acf17f082c140d961a1a\" returns successfully" Jan 14 01:17:45.846406 containerd[1677]: time="2026-01-14T01:17:45.846245948Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:45.847925 containerd[1677]: time="2026-01-14T01:17:45.847849408Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:17:45.848748 containerd[1677]: time="2026-01-14T01:17:45.847964608Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:45.848798 kubelet[2863]: E0114 01:17:45.848204 2863 log.go:32] "PullImage 
from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:17:45.848798 kubelet[2863]: E0114 01:17:45.848265 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:17:45.848798 kubelet[2863]: E0114 01:17:45.848460 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcsgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,
SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:45.849509 containerd[1677]: time="2026-01-14T01:17:45.849352458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:17:45.849649 kubelet[2863]: E0114 01:17:45.849629 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:17:45.991057 containerd[1677]: time="2026-01-14T01:17:45.990892586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fq4fr,Uid:d2a5e836-30ac-41d3-8f91-c61095ba41ec,Namespace:kube-system,Attempt:0,}" Jan 14 01:17:46.104397 systemd-networkd[1572]: cali9741d149bf3: Link UP Jan 14 01:17:46.106194 systemd-networkd[1572]: cali9741d149bf3: Gained carrier Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.039 [INFO][4518] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.050 [INFO][4518] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0 coredns-674b8bbfcf- kube-system d2a5e836-30ac-41d3-8f91-c61095ba41ec 805 0 2026-01-14 01:17:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 coredns-674b8bbfcf-fq4fr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9741d149bf3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.050 [INFO][4518] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.071 [INFO][4530] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" HandleID="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.071 [INFO][4530] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" HandleID="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5840), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"coredns-674b8bbfcf-fq4fr", "timestamp":"2026-01-14 01:17:46.071271919 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.071 [INFO][4530] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.071 [INFO][4530] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.071 [INFO][4530] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.078 [INFO][4530] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.082 [INFO][4530] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.085 [INFO][4530] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.087 [INFO][4530] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.088 [INFO][4530] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.089 [INFO][4530] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.090 [INFO][4530] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198 Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.093 [INFO][4530] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.098 [INFO][4530] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.92.197/26] block=192.168.92.192/26 handle="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.098 [INFO][4530] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.197/26] handle="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.098 [INFO][4530] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:17:46.120190 containerd[1677]: 2026-01-14 01:17:46.098 [INFO][4530] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.197/26] IPv6=[] ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" HandleID="k8s-pod-network.40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Workload="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.101 [INFO][4518] cni-plugin/k8s.go 418: Populated endpoint ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d2a5e836-30ac-41d3-8f91-c61095ba41ec", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"coredns-674b8bbfcf-fq4fr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9741d149bf3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.101 [INFO][4518] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.197/32] ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.101 [INFO][4518] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9741d149bf3 ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.106 [INFO][4518] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.107 [INFO][4518] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d2a5e836-30ac-41d3-8f91-c61095ba41ec", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198", Pod:"coredns-674b8bbfcf-fq4fr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9741d149bf3", 
MAC:"82:6b:a3:35:47:25", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:46.122418 containerd[1677]: 2026-01-14 01:17:46.115 [INFO][4518] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" Namespace="kube-system" Pod="coredns-674b8bbfcf-fq4fr" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-coredns--674b8bbfcf--fq4fr-eth0" Jan 14 01:17:46.152036 kubelet[2863]: E0114 01:17:46.151732 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:17:46.154975 containerd[1677]: time="2026-01-14T01:17:46.154528723Z" level=info msg="connecting to shim 40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198" address="unix:///run/containerd/s/b3af86c8409c07c071d38ab13ac0d990f2d2d0b7371f6f09dcd10810a293bf97" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:46.192248 systemd[1]: Started cri-containerd-40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198.scope - 
libcontainer container 40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198. Jan 14 01:17:46.202096 kubelet[2863]: I0114 01:17:46.201261 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-7p9cs" podStartSLOduration=36.201232378 podStartE2EDuration="36.201232378s" podCreationTimestamp="2026-01-14 01:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:46.201157648 +0000 UTC m=+41.336076901" watchObservedRunningTime="2026-01-14 01:17:46.201232378 +0000 UTC m=+41.336151641" Jan 14 01:17:46.215000 audit: BPF prog-id=200 op=LOAD Jan 14 01:17:46.215000 audit: BPF prog-id=201 op=LOAD Jan 14 01:17:46.215000 audit[4562]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.215000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=201 op=UNLOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=202 op=LOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=203 op=LOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=203 op=UNLOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=202 op=UNLOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.216000 audit: BPF prog-id=204 op=LOAD Jan 14 01:17:46.216000 audit[4562]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4551 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430623765646333663335643566626161303366353666373565333431 Jan 14 01:17:46.225000 audit[4583]: NETFILTER_CFG table=filter:119 family=2 entries=19 op=nft_register_rule pid=4583 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:46.225000 audit[4583]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fffc1d4f7f0 a2=0 a3=7fffc1d4f7dc items=0 
ppid=3013 pid=4583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.225000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:46.231000 audit[4583]: NETFILTER_CFG table=nat:120 family=2 entries=33 op=nft_register_chain pid=4583 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:46.231000 audit[4583]: SYSCALL arch=c000003e syscall=46 success=yes exit=13428 a0=3 a1=7fffc1d4f7f0 a2=0 a3=7fffc1d4f7dc items=0 ppid=3013 pid=4583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.231000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:46.254455 containerd[1677]: time="2026-01-14T01:17:46.254379571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fq4fr,Uid:d2a5e836-30ac-41d3-8f91-c61095ba41ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198\"" Jan 14 01:17:46.259630 containerd[1677]: time="2026-01-14T01:17:46.259454300Z" level=info msg="CreateContainer within sandbox \"40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:17:46.271817 containerd[1677]: time="2026-01-14T01:17:46.271770806Z" level=info msg="Container d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:17:46.277267 containerd[1677]: time="2026-01-14T01:17:46.277160754Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 14 01:17:46.277267 containerd[1677]: time="2026-01-14T01:17:46.277211564Z" level=info msg="CreateContainer within sandbox \"40b7edc3f35d5fbaa03f56f75e34131e1b275ee3d938b5a42f3ffd527ab71198\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11\"" Jan 14 01:17:46.277965 containerd[1677]: time="2026-01-14T01:17:46.277811404Z" level=info msg="StartContainer for \"d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11\"" Jan 14 01:17:46.278063 containerd[1677]: time="2026-01-14T01:17:46.278046964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:17:46.278139 containerd[1677]: time="2026-01-14T01:17:46.278129964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:46.279055 kubelet[2863]: E0114 01:17:46.278792 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:17:46.279112 containerd[1677]: time="2026-01-14T01:17:46.278932654Z" level=info msg="connecting to shim d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11" address="unix:///run/containerd/s/b3af86c8409c07c071d38ab13ac0d990f2d2d0b7371f6f09dcd10810a293bf97" protocol=ttrpc version=3 Jan 14 01:17:46.279242 kubelet[2863]: E0114 01:17:46.279206 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:17:46.280422 kubelet[2863]: E0114 01:17:46.280375 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:46.284136 containerd[1677]: time="2026-01-14T01:17:46.283891032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:17:46.302145 systemd[1]: Started cri-containerd-d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11.scope - libcontainer container d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11. Jan 14 01:17:46.312000 audit: BPF prog-id=205 op=LOAD Jan 14 01:17:46.313000 audit: BPF prog-id=206 op=LOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.313000 audit: BPF prog-id=206 op=UNLOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.313000 audit: BPF prog-id=207 op=LOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.313000 audit: BPF prog-id=208 op=LOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.313000 audit: BPF prog-id=208 op=UNLOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.313000 audit: BPF prog-id=207 op=UNLOAD Jan 14 01:17:46.313000 audit[4592]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.313000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.314000 audit: BPF prog-id=209 op=LOAD Jan 14 01:17:46.314000 audit[4592]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4551 pid=4592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:46.314000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439336165313436663534366231663563666362383666666332653832 Jan 14 01:17:46.332371 containerd[1677]: time="2026-01-14T01:17:46.332319217Z" level=info msg="StartContainer for \"d93ae146f546b1f5cfcb86ffc2e82e23448f13d739868695169993eb50d3db11\" returns successfully" Jan 14 01:17:46.535432 systemd-networkd[1572]: calid91d779bba2: Gained IPv6LL Jan 14 01:17:46.716130 containerd[1677]: 
time="2026-01-14T01:17:46.715555805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:46.718314 containerd[1677]: time="2026-01-14T01:17:46.718186245Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:17:46.719086 kubelet[2863]: E0114 01:17:46.718858 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:17:46.719086 kubelet[2863]: E0114 01:17:46.718917 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:17:46.719256 containerd[1677]: time="2026-01-14T01:17:46.718260015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:46.719863 kubelet[2863]: E0114 01:17:46.719783 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:46.721315 kubelet[2863]: E0114 01:17:46.721233 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:47.113168 systemd-networkd[1572]: cali25f60380d99: Gained IPv6LL Jan 14 01:17:47.176556 kubelet[2863]: E0114 01:17:47.176455 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:17:47.177962 kubelet[2863]: E0114 01:17:47.177899 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: 
not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:17:47.231278 kubelet[2863]: I0114 01:17:47.231191 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-fq4fr" podStartSLOduration=37.231168283 podStartE2EDuration="37.231168283s" podCreationTimestamp="2026-01-14 01:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:17:47.230425033 +0000 UTC m=+42.365344336" watchObservedRunningTime="2026-01-14 01:17:47.231168283 +0000 UTC m=+42.366087576" Jan 14 01:17:47.239327 systemd-networkd[1572]: cali6f4383b743a: Gained IPv6LL Jan 14 01:17:47.250000 audit[4648]: NETFILTER_CFG table=filter:121 family=2 entries=16 op=nft_register_rule pid=4648 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:47.250000 audit[4648]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff352d8f00 a2=0 a3=7fff352d8eec items=0 ppid=3013 pid=4648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:47.250000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:47.264000 audit[4648]: NETFILTER_CFG table=nat:122 family=2 entries=54 op=nft_register_chain pid=4648 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 14 01:17:47.264000 audit[4648]: SYSCALL arch=c000003e syscall=46 success=yes exit=19092 a0=3 a1=7fff352d8f00 a2=0 a3=7fff352d8eec items=0 ppid=3013 pid=4648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:47.264000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:48.135176 systemd-networkd[1572]: cali9741d149bf3: Gained IPv6LL Jan 14 01:17:48.995335 containerd[1677]: time="2026-01-14T01:17:48.995276778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-l6m4k,Uid:a11f31d9-176c-489f-9754-8429b5bd5389,Namespace:calico-system,Attempt:0,}" Jan 14 01:17:48.997387 containerd[1677]: time="2026-01-14T01:17:48.996368817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-dllbx,Uid:0c40b23d-843d-4125-8dd6-0c99d44bb1dc,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:17:48.997387 containerd[1677]: time="2026-01-14T01:17:48.996955107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-w8fr4,Uid:e9b7dca9-0cc9-40e4-b746-163e923a9fd3,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:17:49.222863 systemd-networkd[1572]: caliaeb61cde2ad: Link UP Jan 14 01:17:49.223870 systemd-networkd[1572]: caliaeb61cde2ad: Gained carrier Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.102 [INFO][4684] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.120 [INFO][4684] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0 goldmane-666569f655- calico-system a11f31d9-176c-489f-9754-8429b5bd5389 810 0 2026-01-14 
01:17:22 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 goldmane-666569f655-l6m4k eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] caliaeb61cde2ad [] [] }} ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.120 [INFO][4684] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.182 [INFO][4730] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" HandleID="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Workload="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.182 [INFO][4730] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" HandleID="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Workload="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f580), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-2c3a114250", "pod":"goldmane-666569f655-l6m4k", "timestamp":"2026-01-14 01:17:49.182386272 +0000 UTC"}, 
Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.183 [INFO][4730] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.183 [INFO][4730] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.183 [INFO][4730] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.191 [INFO][4730] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.197 [INFO][4730] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.202 [INFO][4730] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.203 [INFO][4730] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.206 [INFO][4730] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.206 [INFO][4730] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.207 
[INFO][4730] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508 Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.211 [INFO][4730] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.216 [INFO][4730] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.198/26] block=192.168.92.192/26 handle="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.216 [INFO][4730] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.198/26] handle="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.216 [INFO][4730] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:17:49.237604 containerd[1677]: 2026-01-14 01:17:49.216 [INFO][4730] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.198/26] IPv6=[] ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" HandleID="k8s-pod-network.3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Workload="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.219 [INFO][4684] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"a11f31d9-176c-489f-9754-8429b5bd5389", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"goldmane-666569f655-l6m4k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.92.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliaeb61cde2ad", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.219 [INFO][4684] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.198/32] ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.219 [INFO][4684] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaeb61cde2ad ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.222 [INFO][4684] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.222 [INFO][4684] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0", GenerateName:"goldmane-666569f655-", 
Namespace:"calico-system", SelfLink:"", UID:"a11f31d9-176c-489f-9754-8429b5bd5389", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508", Pod:"goldmane-666569f655-l6m4k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.92.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliaeb61cde2ad", MAC:"4e:a9:84:20:cd:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.238309 containerd[1677]: 2026-01-14 01:17:49.231 [INFO][4684] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" Namespace="calico-system" Pod="goldmane-666569f655-l6m4k" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-goldmane--666569f655--l6m4k-eth0" Jan 14 01:17:49.261998 containerd[1677]: time="2026-01-14T01:17:49.261900997Z" level=info msg="connecting to shim 3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508" address="unix:///run/containerd/s/06830b18e72e5e8b503558bf970499c066ec67209b8aea85aa821ccad0e39dbb" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:49.287239 systemd[1]: Started 
cri-containerd-3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508.scope - libcontainer container 3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508. Jan 14 01:17:49.303000 audit: BPF prog-id=210 op=LOAD Jan 14 01:17:49.304000 audit: BPF prog-id=211 op=LOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=211 op=UNLOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=212 op=LOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=213 op=LOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=213 op=UNLOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=212 op=UNLOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.304000 audit: BPF prog-id=214 op=LOAD Jan 14 01:17:49.304000 audit[4774]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4763 pid=4774 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363386165313332353261373832626165623537343765613137633037 Jan 14 01:17:49.324352 systemd-networkd[1572]: cali3d26bf9e6cf: Link UP Jan 14 01:17:49.324990 systemd-networkd[1572]: cali3d26bf9e6cf: Gained carrier Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.110 [INFO][4696] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.130 [INFO][4696] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0 calico-apiserver-75bdffd95c- calico-apiserver 0c40b23d-843d-4125-8dd6-0c99d44bb1dc 809 0 2026-01-14 01:17:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75bdffd95c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 calico-apiserver-75bdffd95c-dllbx 
eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3d26bf9e6cf [] [] }} ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.130 [INFO][4696] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.191 [INFO][4733] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" HandleID="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.191 [INFO][4733] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" HandleID="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-2c3a114250", "pod":"calico-apiserver-75bdffd95c-dllbx", "timestamp":"2026-01-14 01:17:49.19104156 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.191 [INFO][4733] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.216 [INFO][4733] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.217 [INFO][4733] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.291 [INFO][4733] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.297 [INFO][4733] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.301 [INFO][4733] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.303 [INFO][4733] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.306 [INFO][4733] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.306 [INFO][4733] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.307 [INFO][4733] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.311 [INFO][4733] 
ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.317 [INFO][4733] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.199/26] block=192.168.92.192/26 handle="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.317 [INFO][4733] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.199/26] handle="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.317 [INFO][4733] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:17:49.343711 containerd[1677]: 2026-01-14 01:17:49.317 [INFO][4733] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.199/26] IPv6=[] ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" HandleID="k8s-pod-network.f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.321 [INFO][4696] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0", GenerateName:"calico-apiserver-75bdffd95c-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"0c40b23d-843d-4125-8dd6-0c99d44bb1dc", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bdffd95c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"calico-apiserver-75bdffd95c-dllbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3d26bf9e6cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.321 [INFO][4696] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.199/32] ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.321 [INFO][4696] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d26bf9e6cf ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" 
WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.325 [INFO][4696] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.327 [INFO][4696] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0", GenerateName:"calico-apiserver-75bdffd95c-", Namespace:"calico-apiserver", SelfLink:"", UID:"0c40b23d-843d-4125-8dd6-0c99d44bb1dc", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bdffd95c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", 
ContainerID:"f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab", Pod:"calico-apiserver-75bdffd95c-dllbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3d26bf9e6cf", MAC:"6e:f9:7c:85:ed:9b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.344237 containerd[1677]: 2026-01-14 01:17:49.339 [INFO][4696] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-dllbx" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--dllbx-eth0" Jan 14 01:17:49.353113 containerd[1677]: time="2026-01-14T01:17:49.353068010Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-l6m4k,Uid:a11f31d9-176c-489f-9754-8429b5bd5389,Namespace:calico-system,Attempt:0,} returns sandbox id \"3c8ae13252a782baeb5747ea17c072853435f68ec48ccf2eb49983f9a3f1d508\"" Jan 14 01:17:49.355157 containerd[1677]: time="2026-01-14T01:17:49.355141600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:17:49.368800 containerd[1677]: time="2026-01-14T01:17:49.368750327Z" level=info msg="connecting to shim f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab" address="unix:///run/containerd/s/339d1f98a581dd16ec7132d1addbc03eeccf82350881e6d84e418be63a2eb481" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:49.391160 systemd[1]: Started cri-containerd-f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab.scope - libcontainer container f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab. 
Jan 14 01:17:49.404000 audit: BPF prog-id=215 op=LOAD Jan 14 01:17:49.405000 audit: BPF prog-id=216 op=LOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=216 op=UNLOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=217 op=LOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=218 op=LOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=218 op=UNLOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=217 op=UNLOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.405000 audit: BPF prog-id=219 op=LOAD Jan 14 01:17:49.405000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4813 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313531613537663165386237396562653366313231396539336132 Jan 14 01:17:49.437200 systemd-networkd[1572]: calie217cd8b633: Link UP Jan 14 01:17:49.440893 systemd-networkd[1572]: calie217cd8b633: Gained carrier Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.125 [INFO][4697] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.144 [INFO][4697] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0 calico-apiserver-75bdffd95c- calico-apiserver e9b7dca9-0cc9-40e4-b746-163e923a9fd3 808 0 2026-01-14 01:17:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75bdffd95c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-2c3a114250 calico-apiserver-75bdffd95c-w8fr4 eth0 
calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie217cd8b633 [] [] }} ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.144 [INFO][4697] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.204 [INFO][4741] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" HandleID="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.204 [INFO][4741] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" HandleID="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f6d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-2c3a114250", "pod":"calico-apiserver-75bdffd95c-w8fr4", "timestamp":"2026-01-14 01:17:49.204135278 +0000 UTC"}, Hostname:"ci-4578-0-0-p-2c3a114250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.204 [INFO][4741] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.318 [INFO][4741] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.318 [INFO][4741] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-2c3a114250' Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.392 [INFO][4741] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.398 [INFO][4741] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.402 [INFO][4741] ipam/ipam.go 511: Trying affinity for 192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.407 [INFO][4741] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.409 [INFO][4741] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.192/26 host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.409 [INFO][4741] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.192/26 handle="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.411 [INFO][4741] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35 Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.415 [INFO][4741] 
ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.192/26 handle="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.421 [INFO][4741] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.200/26] block=192.168.92.192/26 handle="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.421 [INFO][4741] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.200/26] handle="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" host="ci-4578-0-0-p-2c3a114250" Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.421 [INFO][4741] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:17:49.464087 containerd[1677]: 2026-01-14 01:17:49.421 [INFO][4741] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.200/26] IPv6=[] ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" HandleID="k8s-pod-network.fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Workload="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.426 [INFO][4697] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0", GenerateName:"calico-apiserver-75bdffd95c-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"e9b7dca9-0cc9-40e4-b746-163e923a9fd3", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bdffd95c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", ContainerID:"", Pod:"calico-apiserver-75bdffd95c-w8fr4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie217cd8b633", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.427 [INFO][4697] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.200/32] ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.427 [INFO][4697] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie217cd8b633 ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" 
WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.441 [INFO][4697] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.444 [INFO][4697] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0", GenerateName:"calico-apiserver-75bdffd95c-", Namespace:"calico-apiserver", SelfLink:"", UID:"e9b7dca9-0cc9-40e4-b746-163e923a9fd3", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 17, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bdffd95c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-2c3a114250", 
ContainerID:"fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35", Pod:"calico-apiserver-75bdffd95c-w8fr4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie217cd8b633", MAC:"42:f8:9b:e8:81:c5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:17:49.464718 containerd[1677]: 2026-01-14 01:17:49.461 [INFO][4697] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" Namespace="calico-apiserver" Pod="calico-apiserver-75bdffd95c-w8fr4" WorkloadEndpoint="ci--4578--0--0--p--2c3a114250-k8s-calico--apiserver--75bdffd95c--w8fr4-eth0" Jan 14 01:17:49.467663 containerd[1677]: time="2026-01-14T01:17:49.467170208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-dllbx,Uid:0c40b23d-843d-4125-8dd6-0c99d44bb1dc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f8151a57f1e8b79ebe3f1219e93a2abdf730c045974f2d15f81b6a2488d98bab\"" Jan 14 01:17:49.489573 containerd[1677]: time="2026-01-14T01:17:49.489521995Z" level=info msg="connecting to shim fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35" address="unix:///run/containerd/s/135bf22516fa3096034391c60d4090c865472c492bf96ab54548c30bd7b18379" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:17:49.515288 systemd[1]: Started cri-containerd-fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35.scope - libcontainer container fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35. 
Jan 14 01:17:49.526000 audit: BPF prog-id=220 op=LOAD Jan 14 01:17:49.527000 audit: BPF prog-id=221 op=LOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=221 op=UNLOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=222 op=LOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=223 op=LOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=223 op=UNLOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=222 op=UNLOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.527000 audit: BPF prog-id=224 op=LOAD Jan 14 01:17:49.527000 audit[4875]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4864 pid=4875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:49.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663323762613866373739333338656530353663653534633432383239 Jan 14 01:17:49.564090 containerd[1677]: time="2026-01-14T01:17:49.564049021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bdffd95c-w8fr4,Uid:e9b7dca9-0cc9-40e4-b746-163e923a9fd3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fc27ba8f779338ee056ce54c428291a0a8c93b5d0f54bd2ca36dd1d828ff5a35\"" Jan 14 01:17:49.761441 containerd[1677]: time="2026-01-14T01:17:49.761294594Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:49.762859 containerd[1677]: time="2026-01-14T01:17:49.762776774Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:17:49.762993 containerd[1677]: time="2026-01-14T01:17:49.762965063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active 
requests=0, bytes read=0" Jan 14 01:17:49.763387 kubelet[2863]: E0114 01:17:49.763260 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:17:49.763387 kubelet[2863]: E0114 01:17:49.763324 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:17:49.767155 containerd[1677]: time="2026-01-14T01:17:49.765063563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:17:49.767329 kubelet[2863]: E0114 01:17:49.765218 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snjlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:49.768291 kubelet[2863]: E0114 01:17:49.768098 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:17:50.177058 containerd[1677]: time="2026-01-14T01:17:50.176338883Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:50.178384 containerd[1677]: time="2026-01-14T01:17:50.178176873Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:17:50.178384 containerd[1677]: time="2026-01-14T01:17:50.178291643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:50.178565 kubelet[2863]: E0114 01:17:50.178498 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:17:50.178653 kubelet[2863]: E0114 01:17:50.178576 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:17:50.181245 containerd[1677]: time="2026-01-14T01:17:50.181191162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:17:50.182569 kubelet[2863]: E0114 01:17:50.182502 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncwmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:50.184181 kubelet[2863]: E0114 01:17:50.184120 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:17:50.196589 kubelet[2863]: E0114 01:17:50.196522 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:17:50.203419 kubelet[2863]: E0114 01:17:50.203235 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:17:50.251000 audit[4914]: NETFILTER_CFG table=filter:123 family=2 entries=16 op=nft_register_rule pid=4914 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" 
Jan 14 01:17:50.251000 audit[4914]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffedc16ff50 a2=0 a3=7ffedc16ff3c items=0 ppid=3013 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:50.251000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:50.257000 audit[4914]: NETFILTER_CFG table=nat:124 family=2 entries=18 op=nft_register_rule pid=4914 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:50.257000 audit[4914]: SYSCALL arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7ffedc16ff50 a2=0 a3=0 items=0 ppid=3013 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:50.257000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:50.282000 audit[4920]: NETFILTER_CFG table=filter:125 family=2 entries=16 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:50.282000 audit[4920]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff4c536cd0 a2=0 a3=7fff4c536cbc items=0 ppid=3013 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:50.282000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:50.287000 audit[4920]: NETFILTER_CFG table=nat:126 family=2 entries=18 op=nft_register_rule pid=4920 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:50.287000 audit[4920]: SYSCALL arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7fff4c536cd0 a2=0 a3=0 items=0 ppid=3013 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:50.287000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:50.612668 containerd[1677]: time="2026-01-14T01:17:50.612592028Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:50.614258 containerd[1677]: time="2026-01-14T01:17:50.614189727Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:17:50.614345 containerd[1677]: time="2026-01-14T01:17:50.614308667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:50.614625 kubelet[2863]: E0114 01:17:50.614553 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:17:50.615074 kubelet[2863]: E0114 01:17:50.614622 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:17:50.615074 
kubelet[2863]: E0114 01:17:50.614836 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlp4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:50.616548 kubelet[2863]: E0114 01:17:50.616481 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:17:50.696199 systemd-networkd[1572]: caliaeb61cde2ad: Gained IPv6LL Jan 14 01:17:51.015374 systemd-networkd[1572]: calie217cd8b633: Gained IPv6LL Jan 14 01:17:51.204343 kubelet[2863]: E0114 01:17:51.204271 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:17:51.205703 kubelet[2863]: E0114 01:17:51.205402 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:17:51.205703 kubelet[2863]: E0114 01:17:51.205450 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:17:51.272317 systemd-networkd[1572]: cali3d26bf9e6cf: Gained IPv6LL Jan 14 01:17:51.277000 audit[4931]: NETFILTER_CFG table=filter:127 family=2 entries=16 op=nft_register_rule pid=4931 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:51.280045 kernel: kauditd_printk_skb: 212 callbacks suppressed Jan 14 01:17:51.280150 kernel: audit: type=1325 audit(1768353471.277:672): 
table=filter:127 family=2 entries=16 op=nft_register_rule pid=4931 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:51.290113 kernel: audit: type=1300 audit(1768353471.277:672): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffde52cda80 a2=0 a3=7ffde52cda6c items=0 ppid=3013 pid=4931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.277000 audit[4931]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffde52cda80 a2=0 a3=7ffde52cda6c items=0 ppid=3013 pid=4931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.277000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:51.302164 kernel: audit: type=1327 audit(1768353471.277:672): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:51.308000 audit[4931]: NETFILTER_CFG table=nat:128 family=2 entries=18 op=nft_register_rule pid=4931 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:51.320910 kernel: audit: type=1325 audit(1768353471.308:673): table=nat:128 family=2 entries=18 op=nft_register_rule pid=4931 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:51.321030 kernel: audit: type=1300 audit(1768353471.308:673): arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7ffde52cda80 a2=0 a3=0 items=0 ppid=3013 pid=4931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.308000 
audit[4931]: SYSCALL arch=c000003e syscall=46 success=yes exit=5004 a0=3 a1=7ffde52cda80 a2=0 a3=0 items=0 ppid=3013 pid=4931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.308000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:51.326029 kernel: audit: type=1327 audit(1768353471.308:673): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:51.382723 kubelet[2863]: I0114 01:17:51.382650 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:17:51.627000 audit: BPF prog-id=225 op=LOAD Jan 14 01:17:51.636543 kernel: audit: type=1334 audit(1768353471.627:674): prog-id=225 op=LOAD Jan 14 01:17:51.636648 kernel: audit: type=1300 audit(1768353471.627:674): arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd7bf5e250 a2=98 a3=1fffffffffffffff items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd7bf5e250 a2=98 a3=1fffffffffffffff items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.642558 kernel: audit: type=1327 audit(1768353471.627:674): 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=225 op=UNLOAD Jan 14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd7bf5e220 a3=0 items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=226 op=LOAD Jan 14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd7bf5e130 a2=94 a3=3 items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=226 op=UNLOAD Jan 
14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd7bf5e130 a2=94 a3=3 items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.646045 kernel: audit: type=1334 audit(1768353471.627:675): prog-id=225 op=UNLOAD Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=227 op=LOAD Jan 14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd7bf5e170 a2=94 a3=7ffd7bf5e350 items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=227 op=UNLOAD Jan 14 01:17:51.627000 audit[4985]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffd7bf5e170 a2=94 a3=7ffd7bf5e350 items=0 ppid=4969 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:17:51.627000 audit: BPF prog-id=228 op=LOAD Jan 14 01:17:51.627000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff6e2ef490 a2=98 a3=3 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.627000 audit: BPF prog-id=228 op=UNLOAD Jan 14 01:17:51.627000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff6e2ef460 a3=0 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.627000 audit: BPF prog-id=229 op=LOAD Jan 14 01:17:51.627000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff6e2ef280 a2=94 a3=54428f items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.627000 audit: BPF prog-id=229 op=UNLOAD Jan 14 01:17:51.627000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff6e2ef280 a2=94 a3=54428f items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.627000 audit: BPF prog-id=230 op=LOAD Jan 14 01:17:51.627000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff6e2ef2b0 a2=94 a3=2 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.627000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.628000 audit: BPF prog-id=230 op=UNLOAD Jan 14 01:17:51.628000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff6e2ef2b0 a2=0 a3=2 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.628000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.775000 audit: BPF prog-id=231 op=LOAD Jan 14 01:17:51.775000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff6e2ef170 a2=94 a3=1 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.775000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.775000 audit: BPF prog-id=231 op=UNLOAD Jan 14 01:17:51.775000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff6e2ef170 a2=94 a3=1 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.775000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.782000 audit: BPF prog-id=232 op=LOAD Jan 14 01:17:51.782000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff6e2ef160 a2=94 a3=4 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.782000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.782000 audit: BPF prog-id=232 op=UNLOAD Jan 14 01:17:51.782000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff6e2ef160 a2=0 a3=4 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.782000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=233 op=LOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff6e2eefc0 a2=94 a3=5 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=233 op=UNLOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff6e2eefc0 a2=0 a3=5 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=234 op=LOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff6e2ef1e0 a2=94 a3=6 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=234 op=UNLOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff6e2ef1e0 a2=0 a3=6 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=235 op=LOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff6e2ee990 a2=94 a3=88 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=236 op=LOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff6e2ee810 a2=94 a3=2 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 
01:17:51.783000 audit: BPF prog-id=236 op=UNLOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff6e2ee840 a2=0 a3=7fff6e2ee940 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.783000 audit: BPF prog-id=235 op=UNLOAD Jan 14 01:17:51.783000 audit[4986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=24388d10 a2=0 a3=951f7015a27591c2 items=0 ppid=4969 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.783000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:17:51.792000 audit: BPF prog-id=237 op=LOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffa9391fe0 a2=98 a3=1999999999999999 items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.792000 audit: BPF prog-id=237 op=UNLOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffa9391fb0 a3=0 items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.792000 audit: BPF prog-id=238 op=LOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffa9391ec0 a2=94 a3=ffff items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.792000 audit: BPF prog-id=238 op=UNLOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffa9391ec0 a2=94 a3=ffff items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.792000 audit: BPF prog-id=239 op=LOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffa9391f00 a2=94 a3=7fffa93920e0 items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.792000 audit: BPF prog-id=239 op=UNLOAD Jan 14 01:17:51.792000 audit[4989]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffa9391f00 a2=94 a3=7fffa93920e0 items=0 ppid=4969 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.792000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:17:51.865072 systemd-networkd[1572]: vxlan.calico: Link UP Jan 14 01:17:51.865082 systemd-networkd[1572]: vxlan.calico: Gained carrier Jan 14 01:17:51.903000 audit: BPF prog-id=240 op=LOAD Jan 14 01:17:51.903000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffebbc8300 a2=98 a3=0 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.903000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.904000 audit: BPF 
prog-id=240 op=UNLOAD Jan 14 01:17:51.904000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffebbc82d0 a3=0 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.904000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=241 op=LOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffebbc8110 a2=94 a3=54428f items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=241 op=UNLOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffebbc8110 a2=94 a3=54428f items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=242 op=LOAD Jan 14 01:17:51.906000 audit[5016]: 
SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffebbc8140 a2=94 a3=2 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=242 op=UNLOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffebbc8140 a2=0 a3=2 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=243 op=LOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffebbc7ef0 a2=94 a3=4 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=243 op=UNLOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 
a1=7fffebbc7ef0 a2=94 a3=4 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=244 op=LOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffebbc7ff0 a2=94 a3=7fffebbc8170 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.906000 audit: BPF prog-id=244 op=UNLOAD Jan 14 01:17:51.906000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fffebbc7ff0 a2=0 a3=7fffebbc8170 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.906000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.914000 audit: BPF prog-id=245 op=LOAD Jan 14 01:17:51.914000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffebbc7720 a2=94 a3=2 items=0 ppid=4969 
pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.914000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.914000 audit: BPF prog-id=245 op=UNLOAD Jan 14 01:17:51.914000 audit[5016]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fffebbc7720 a2=0 a3=2 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.914000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.918000 audit: BPF prog-id=246 op=LOAD Jan 14 01:17:51.918000 audit[5016]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffebbc7820 a2=94 a3=30 items=0 ppid=4969 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.918000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:17:51.931000 audit: BPF prog-id=247 op=LOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe65d29740 a2=98 a3=0 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:51.931000 audit: BPF prog-id=247 op=UNLOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe65d29710 a3=0 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:51.931000 audit: BPF prog-id=248 op=LOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe65d29530 a2=94 a3=54428f items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:51.931000 audit: BPF prog-id=248 op=UNLOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe65d29530 a2=94 a3=54428f items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 
audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:51.931000 audit: BPF prog-id=249 op=LOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe65d29560 a2=94 a3=2 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:51.931000 audit: BPF prog-id=249 op=UNLOAD Jan 14 01:17:51.931000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe65d29560 a2=0 a3=2 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:51.931000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.081000 audit: BPF prog-id=250 op=LOAD Jan 14 01:17:52.081000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe65d29420 a2=94 a3=1 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.081000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.081000 audit: BPF prog-id=250 op=UNLOAD Jan 14 01:17:52.081000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe65d29420 a2=94 a3=1 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.081000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.088000 audit: BPF prog-id=251 op=LOAD Jan 14 01:17:52.088000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe65d29410 a2=94 a3=4 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.088000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.088000 audit: BPF prog-id=251 op=UNLOAD Jan 14 01:17:52.088000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe65d29410 a2=0 a3=4 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.088000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=252 op=LOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe65d29270 a2=94 a3=5 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=252 op=UNLOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe65d29270 a2=0 a3=5 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=253 op=LOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe65d29490 a2=94 a3=6 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=253 op=UNLOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe65d29490 a2=0 a3=6 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=254 op=LOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe65d28c40 a2=94 a3=88 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=255 op=LOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffe65d28ac0 a2=94 a3=2 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.089000 audit: BPF prog-id=255 op=UNLOAD Jan 14 01:17:52.089000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffe65d28af0 a2=0 a3=7ffe65d28bf0 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.089000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.090000 audit: BPF prog-id=254 op=UNLOAD Jan 14 01:17:52.090000 audit[5022]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=b3ecd10 a2=0 a3=c046b07faf73c610 items=0 ppid=4969 pid=5022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.090000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:17:52.095000 audit: BPF prog-id=246 op=UNLOAD Jan 14 01:17:52.095000 audit[4969]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c001206240 a2=0 a3=0 items=0 ppid=4030 pid=4969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.095000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 14 01:17:52.215000 audit[5058]: NETFILTER_CFG table=mangle:129 
family=2 entries=16 op=nft_register_chain pid=5058 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:17:52.215000 audit[5058]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7fff0dba3e90 a2=0 a3=7fff0dba3e7c items=0 ppid=4969 pid=5058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.215000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:17:52.218000 audit[5055]: NETFILTER_CFG table=nat:130 family=2 entries=15 op=nft_register_chain pid=5055 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:17:52.218000 audit[5055]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffe6e1c2d40 a2=0 a3=7ffe6e1c2d2c items=0 ppid=4969 pid=5055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.218000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:17:52.227000 audit[5054]: NETFILTER_CFG table=raw:131 family=2 entries=21 op=nft_register_chain pid=5054 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:17:52.227000 audit[5054]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffd484fbc40 a2=0 a3=7ffd484fbc2c items=0 ppid=4969 pid=5054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.227000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:17:52.232000 audit[5053]: NETFILTER_CFG table=filter:132 family=2 entries=315 op=nft_register_chain pid=5053 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:17:52.232000 audit[5053]: SYSCALL arch=c000003e syscall=46 success=yes exit=187764 a0=3 a1=7ffe63e8d1a0 a2=0 a3=7ffe63e8d18c items=0 ppid=4969 pid=5053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.232000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:17:52.336586 kubelet[2863]: I0114 01:17:52.336283 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:17:52.357000 audit[5074]: NETFILTER_CFG table=filter:133 family=2 entries=15 op=nft_register_rule pid=5074 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:52.357000 audit[5074]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe943ec880 a2=0 a3=7ffe943ec86c items=0 ppid=3013 pid=5074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.357000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:52.364000 audit[5074]: NETFILTER_CFG table=nat:134 family=2 entries=25 op=nft_register_chain pid=5074 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:17:52.364000 audit[5074]: SYSCALL arch=c000003e syscall=46 success=yes exit=8580 a0=3 
a1=7ffe943ec880 a2=0 a3=7ffe943ec86c items=0 ppid=3013 pid=5074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:17:52.364000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:17:53.895254 systemd-networkd[1572]: vxlan.calico: Gained IPv6LL Jan 14 01:17:53.993883 containerd[1677]: time="2026-01-14T01:17:53.993823354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:17:54.420621 containerd[1677]: time="2026-01-14T01:17:54.420554915Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:54.422265 containerd[1677]: time="2026-01-14T01:17:54.422192635Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:17:54.422521 containerd[1677]: time="2026-01-14T01:17:54.422221505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:54.422564 kubelet[2863]: E0114 01:17:54.422507 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:17:54.422966 kubelet[2863]: E0114 01:17:54.422561 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:17:54.422966 kubelet[2863]: E0114 01:17:54.422712 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0fc0a424dd174edd96fbff1617ff49ab,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:54.425308 containerd[1677]: 
time="2026-01-14T01:17:54.425256734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:17:54.857808 containerd[1677]: time="2026-01-14T01:17:54.857737676Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:54.859486 containerd[1677]: time="2026-01-14T01:17:54.859391646Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:17:54.859486 containerd[1677]: time="2026-01-14T01:17:54.859438436Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:54.859719 kubelet[2863]: E0114 01:17:54.859639 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:17:54.859719 kubelet[2863]: E0114 01:17:54.859714 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:17:54.859896 kubelet[2863]: E0114 01:17:54.859837 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:54.861351 kubelet[2863]: E0114 01:17:54.861302 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:17:57.991331 containerd[1677]: time="2026-01-14T01:17:57.991292598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:17:58.399417 containerd[1677]: time="2026-01-14T01:17:58.399208200Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:17:58.401279 containerd[1677]: time="2026-01-14T01:17:58.401149589Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:17:58.401279 containerd[1677]: time="2026-01-14T01:17:58.401230899Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:17:58.401621 kubelet[2863]: E0114 01:17:58.401481 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:17:58.401621 kubelet[2863]: E0114 01:17:58.401542 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:17:58.402381 kubelet[2863]: E0114 01:17:58.401754 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcsgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:17:58.403681 kubelet[2863]: E0114 01:17:58.403580 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" 
podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:17:59.991700 containerd[1677]: time="2026-01-14T01:17:59.991624726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:18:00.412802 containerd[1677]: time="2026-01-14T01:18:00.412610085Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:00.414034 containerd[1677]: time="2026-01-14T01:18:00.413972156Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:18:00.414108 containerd[1677]: time="2026-01-14T01:18:00.414068206Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:00.414310 kubelet[2863]: E0114 01:18:00.414216 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:18:00.414310 kubelet[2863]: E0114 01:18:00.414268 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:18:00.415357 kubelet[2863]: E0114 01:18:00.414409 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:18:00.416879 containerd[1677]: time="2026-01-14T01:18:00.416591606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:18:00.842578 containerd[1677]: time="2026-01-14T01:18:00.842505617Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:00.844205 containerd[1677]: time="2026-01-14T01:18:00.844107687Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:18:00.844400 containerd[1677]: time="2026-01-14T01:18:00.844225727Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:00.844454 kubelet[2863]: E0114 01:18:00.844419 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:18:00.844519 kubelet[2863]: E0114 01:18:00.844486 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:18:00.844751 kubelet[2863]: E0114 01:18:00.844638 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:00.845948 kubelet[2863]: E0114 01:18:00.845890 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:18:01.991336 containerd[1677]: time="2026-01-14T01:18:01.991273583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:18:02.413852 containerd[1677]: time="2026-01-14T01:18:02.413689618Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:02.415367 containerd[1677]: time="2026-01-14T01:18:02.415300208Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:18:02.415494 containerd[1677]: time="2026-01-14T01:18:02.415404268Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:02.415630 kubelet[2863]: E0114 01:18:02.415562 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:18:02.415630 kubelet[2863]: E0114 01:18:02.415616 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:18:02.416107 kubelet[2863]: E0114 01:18:02.415777 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snjlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Probe
Handler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:02.417346 kubelet[2863]: E0114 01:18:02.417300 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:18:02.993694 containerd[1677]: 
time="2026-01-14T01:18:02.993200218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:18:03.419970 containerd[1677]: time="2026-01-14T01:18:03.419820001Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:03.421491 containerd[1677]: time="2026-01-14T01:18:03.421413851Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:18:03.421630 containerd[1677]: time="2026-01-14T01:18:03.421511201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:03.421728 kubelet[2863]: E0114 01:18:03.421668 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:03.422211 kubelet[2863]: E0114 01:18:03.421733 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:03.422211 kubelet[2863]: E0114 01:18:03.421907 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlp4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:03.423463 kubelet[2863]: E0114 01:18:03.423393 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:18:04.996419 containerd[1677]: time="2026-01-14T01:18:04.994836448Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:18:05.425610 containerd[1677]: time="2026-01-14T01:18:05.425242244Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:05.426980 containerd[1677]: time="2026-01-14T01:18:05.426883175Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:18:05.427098 containerd[1677]: time="2026-01-14T01:18:05.426996795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:05.427316 kubelet[2863]: E0114 01:18:05.427266 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:05.427854 kubelet[2863]: E0114 01:18:05.427330 2863 kuberuntime_image.go:42] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:05.427854 kubelet[2863]: E0114 01:18:05.427496 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncwmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:05.429692 kubelet[2863]: E0114 01:18:05.429539 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:18:07.992360 kubelet[2863]: E0114 01:18:07.992283 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:18:12.995056 kubelet[2863]: E0114 01:18:12.994629 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:18:12.997622 kubelet[2863]: E0114 01:18:12.997334 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:18:14.996721 kubelet[2863]: E0114 01:18:14.996606 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:18:16.992145 kubelet[2863]: E0114 01:18:16.991925 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:18:18.995253 kubelet[2863]: E0114 01:18:18.994903 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 
01:18:19.993036 containerd[1677]: time="2026-01-14T01:18:19.992874154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:18:20.415903 containerd[1677]: time="2026-01-14T01:18:20.415738409Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:20.417166 containerd[1677]: time="2026-01-14T01:18:20.417125741Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:18:20.417386 containerd[1677]: time="2026-01-14T01:18:20.417142631Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:20.417424 kubelet[2863]: E0114 01:18:20.417380 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:18:20.417816 kubelet[2863]: E0114 01:18:20.417429 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:18:20.417816 kubelet[2863]: E0114 01:18:20.417549 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0fc0a424dd174edd96fbff1617ff49ab,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:20.420089 containerd[1677]: time="2026-01-14T01:18:20.420032745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:18:20.843742 containerd[1677]: 
time="2026-01-14T01:18:20.843677450Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:20.844930 containerd[1677]: time="2026-01-14T01:18:20.844882169Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:18:20.844996 containerd[1677]: time="2026-01-14T01:18:20.844954430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:20.845206 kubelet[2863]: E0114 01:18:20.845153 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:18:20.845206 kubelet[2863]: E0114 01:18:20.845202 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:18:20.845438 kubelet[2863]: E0114 01:18:20.845308 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:20.847037 kubelet[2863]: E0114 01:18:20.846610 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:18:25.001489 containerd[1677]: time="2026-01-14T01:18:25.001232089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:18:25.429831 containerd[1677]: time="2026-01-14T01:18:25.429430804Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:25.431863 containerd[1677]: time="2026-01-14T01:18:25.431738825Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:18:25.431863 containerd[1677]: time="2026-01-14T01:18:25.431812336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:25.432351 kubelet[2863]: E0114 01:18:25.432303 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:18:25.433251 kubelet[2863]: E0114 01:18:25.432655 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:18:25.435073 kubelet[2863]: E0114 01:18:25.434252 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcsgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:25.436525 kubelet[2863]: E0114 01:18:25.436464 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" 
podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:18:26.997075 containerd[1677]: time="2026-01-14T01:18:26.994851281Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:18:27.413998 containerd[1677]: time="2026-01-14T01:18:27.413458211Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:27.415966 containerd[1677]: time="2026-01-14T01:18:27.415760290Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:18:27.415966 containerd[1677]: time="2026-01-14T01:18:27.415893732Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:27.418334 kubelet[2863]: E0114 01:18:27.416339 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:18:27.418334 kubelet[2863]: E0114 01:18:27.416418 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:18:27.418334 kubelet[2863]: E0114 01:18:27.416649 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:18:27.420705 containerd[1677]: time="2026-01-14T01:18:27.420639972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:18:27.846368 containerd[1677]: time="2026-01-14T01:18:27.846300198Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:27.847636 containerd[1677]: time="2026-01-14T01:18:27.847587394Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:18:27.848470 containerd[1677]: time="2026-01-14T01:18:27.847669985Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:27.848685 kubelet[2863]: E0114 01:18:27.848627 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:18:27.848685 kubelet[2863]: E0114 01:18:27.848681 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:18:27.848987 kubelet[2863]: E0114 01:18:27.848858 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:27.850525 kubelet[2863]: E0114 01:18:27.850446 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:18:27.991690 containerd[1677]: time="2026-01-14T01:18:27.991627633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:18:28.423676 containerd[1677]: time="2026-01-14T01:18:28.423617890Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:28.425260 containerd[1677]: time="2026-01-14T01:18:28.425200919Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:18:28.425360 containerd[1677]: time="2026-01-14T01:18:28.425213060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:28.426550 kubelet[2863]: E0114 01:18:28.426498 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:18:28.427318 kubelet[2863]: E0114 01:18:28.426657 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:18:28.427318 kubelet[2863]: E0114 01:18:28.427036 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snjlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Probe
Handler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:28.428324 kubelet[2863]: E0114 01:18:28.428280 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:18:28.994805 containerd[1677]: 
time="2026-01-14T01:18:28.994751495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:18:29.428142 containerd[1677]: time="2026-01-14T01:18:29.427964478Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:29.429142 containerd[1677]: time="2026-01-14T01:18:29.429085731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:18:29.430098 containerd[1677]: time="2026-01-14T01:18:29.429146922Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:29.430603 kubelet[2863]: E0114 01:18:29.430439 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:29.430603 kubelet[2863]: E0114 01:18:29.430529 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:29.431862 kubelet[2863]: E0114 01:18:29.431191 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlp4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:29.432921 kubelet[2863]: E0114 01:18:29.432863 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:18:30.994533 containerd[1677]: time="2026-01-14T01:18:30.994220785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:18:31.416892 containerd[1677]: time="2026-01-14T01:18:31.416274147Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:18:31.417621 containerd[1677]: time="2026-01-14T01:18:31.417542301Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:18:31.417904 containerd[1677]: time="2026-01-14T01:18:31.417604072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:18:31.418477 kubelet[2863]: E0114 01:18:31.418388 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:31.420276 kubelet[2863]: E0114 01:18:31.418601 2863 kuberuntime_image.go:42] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:18:31.420276 kubelet[2863]: E0114 01:18:31.418916 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncwmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:18:31.420276 kubelet[2863]: E0114 01:18:31.420062 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:18:32.996891 kubelet[2863]: E0114 01:18:32.996791 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:18:35.992241 kubelet[2863]: E0114 01:18:35.992172 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:18:38.994940 kubelet[2863]: E0114 01:18:38.994697 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:18:40.992706 kubelet[2863]: E0114 01:18:40.992650 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:18:40.998092 kubelet[2863]: E0114 01:18:40.998039 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:18:45.993060 kubelet[2863]: E0114 01:18:45.992040 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 
01:18:46.353721 systemd[1]: Started sshd@7-77.42.79.167:22-68.220.241.50:32890.service - OpenSSH per-connection server daemon (68.220.241.50:32890). Jan 14 01:18:46.353000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.79.167:22-68.220.241.50:32890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:46.355378 kernel: kauditd_printk_skb: 200 callbacks suppressed Jan 14 01:18:46.355435 kernel: audit: type=1130 audit(1768353526.353:742): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.79.167:22-68.220.241.50:32890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:46.995030 kubelet[2863]: E0114 01:18:46.994959 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:18:46.995828 kubelet[2863]: E0114 01:18:46.995794 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:18:47.047648 sshd[5197]: Accepted publickey for core from 68.220.241.50 port 32890 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:18:47.046000 audit[5197]: USER_ACCT pid=5197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.053082 sshd-session[5197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:18:47.055636 kernel: audit: type=1101 audit(1768353527.046:743): pid=5197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.049000 audit[5197]: CRED_ACQ pid=5197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.062230 kernel: audit: type=1103 audit(1768353527.049:744): pid=5197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.066532 systemd-logind[1657]: New session 9 of 
user core. Jan 14 01:18:47.069028 kernel: audit: type=1006 audit(1768353527.049:745): pid=5197 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 14 01:18:47.070246 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 14 01:18:47.049000 audit[5197]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb9bc85c0 a2=3 a3=0 items=0 ppid=1 pid=5197 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:47.078065 kernel: audit: type=1300 audit(1768353527.049:745): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb9bc85c0 a2=3 a3=0 items=0 ppid=1 pid=5197 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:47.085438 kernel: audit: type=1327 audit(1768353527.049:745): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:47.049000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:47.084000 audit[5197]: USER_START pid=5197 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.088000 audit[5201]: CRED_ACQ pid=5201 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.096361 kernel: audit: type=1105 audit(1768353527.084:746): pid=5197 uid=0 auid=500 ses=9 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.096424 kernel: audit: type=1103 audit(1768353527.088:747): pid=5201 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.518069 sshd[5201]: Connection closed by 68.220.241.50 port 32890 Jan 14 01:18:47.519521 sshd-session[5197]: pam_unix(sshd:session): session closed for user core Jan 14 01:18:47.523000 audit[5197]: USER_END pid=5197 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.547229 kernel: audit: type=1106 audit(1768353527.523:748): pid=5197 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.549413 systemd[1]: sshd@7-77.42.79.167:22-68.220.241.50:32890.service: Deactivated successfully. Jan 14 01:18:47.550666 systemd-logind[1657]: Session 9 logged out. Waiting for processes to exit. Jan 14 01:18:47.558508 systemd[1]: session-9.scope: Deactivated successfully. 
Jan 14 01:18:47.523000 audit[5197]: CRED_DISP pid=5197 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.570059 kernel: audit: type=1104 audit(1768353527.523:749): pid=5197 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:47.573886 systemd-logind[1657]: Removed session 9. Jan 14 01:18:47.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.79.167:22-68.220.241.50:32890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:50.997295 kubelet[2863]: E0114 01:18:50.997185 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:18:52.653268 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:18:52.653393 kernel: audit: type=1130 
audit(1768353532.651:751): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.79.167:22-68.220.241.50:45220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:52.651000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.79.167:22-68.220.241.50:45220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:52.652140 systemd[1]: Started sshd@8-77.42.79.167:22-68.220.241.50:45220.service - OpenSSH per-connection server daemon (68.220.241.50:45220). Jan 14 01:18:53.001029 kubelet[2863]: E0114 01:18:52.999784 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:18:53.307000 audit[5238]: USER_ACCT pid=5238 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.320053 kernel: audit: type=1101 audit(1768353533.307:752): pid=5238 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.311839 sshd-session[5238]: pam_unix(sshd:session): session opened for user core(uid=500) 
by core(uid=0) Jan 14 01:18:53.320730 sshd[5238]: Accepted publickey for core from 68.220.241.50 port 45220 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:18:53.307000 audit[5238]: CRED_ACQ pid=5238 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.336114 kernel: audit: type=1103 audit(1768353533.307:753): pid=5238 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.336526 kernel: audit: type=1006 audit(1768353533.310:754): pid=5238 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 14 01:18:53.336545 kernel: audit: type=1300 audit(1768353533.310:754): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8c223070 a2=3 a3=0 items=0 ppid=1 pid=5238 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:53.310000 audit[5238]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe8c223070 a2=3 a3=0 items=0 ppid=1 pid=5238 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:53.342167 kernel: audit: type=1327 audit(1768353533.310:754): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:53.310000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:53.346368 systemd-logind[1657]: New session 10 of user core. 
Jan 14 01:18:53.352249 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 14 01:18:53.364863 kernel: audit: type=1105 audit(1768353533.356:755): pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.356000 audit[5238]: USER_START pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.364000 audit[5242]: CRED_ACQ pid=5242 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.372068 kernel: audit: type=1103 audit(1768353533.364:756): pid=5242 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.791848 sshd[5242]: Connection closed by 68.220.241.50 port 45220 Jan 14 01:18:53.792796 sshd-session[5238]: pam_unix(sshd:session): session closed for user core Jan 14 01:18:53.794000 audit[5238]: USER_END pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh 
res=success' Jan 14 01:18:53.802907 systemd[1]: sshd@8-77.42.79.167:22-68.220.241.50:45220.service: Deactivated successfully. Jan 14 01:18:53.808683 systemd[1]: session-10.scope: Deactivated successfully. Jan 14 01:18:53.814073 kernel: audit: type=1106 audit(1768353533.794:757): pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.795000 audit[5238]: CRED_DISP pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.818666 systemd-logind[1657]: Session 10 logged out. Waiting for processes to exit. Jan 14 01:18:53.821795 systemd-logind[1657]: Removed session 10. Jan 14 01:18:53.827087 kernel: audit: type=1104 audit(1768353533.795:758): pid=5238 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:53.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.79.167:22-68.220.241.50:45220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:18:53.992691 kubelet[2863]: E0114 01:18:53.992600 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:18:56.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.79.167:22-188.166.93.218:44498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:56.240488 systemd[1]: Started sshd@9-77.42.79.167:22-188.166.93.218:44498.service - OpenSSH per-connection server daemon (188.166.93.218:44498). Jan 14 01:18:56.306669 sshd[5255]: Connection closed by 188.166.93.218 port 44498 Jan 14 01:18:56.307855 systemd[1]: sshd@9-77.42.79.167:22-188.166.93.218:44498.service: Deactivated successfully. Jan 14 01:18:56.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.79.167:22-188.166.93.218:44498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:58.931054 kernel: kauditd_printk_skb: 3 callbacks suppressed Jan 14 01:18:58.932301 kernel: audit: type=1130 audit(1768353538.928:762): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.79.167:22-68.220.241.50:45228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:18:58.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.79.167:22-68.220.241.50:45228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:18:58.929496 systemd[1]: Started sshd@10-77.42.79.167:22-68.220.241.50:45228.service - OpenSSH per-connection server daemon (68.220.241.50:45228). Jan 14 01:18:58.994448 kubelet[2863]: E0114 01:18:58.994391 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:18:59.593393 sshd[5260]: Accepted publickey for core from 68.220.241.50 port 45228 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:18:59.609334 kernel: audit: type=1101 audit(1768353539.592:763): pid=5260 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.592000 audit[5260]: USER_ACCT pid=5260 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.619067 sshd-session[5260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 
01:18:59.614000 audit[5260]: CRED_ACQ pid=5260 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.634481 kernel: audit: type=1103 audit(1768353539.614:764): pid=5260 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.644784 kernel: audit: type=1006 audit(1768353539.614:765): pid=5260 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 14 01:18:59.643688 systemd-logind[1657]: New session 11 of user core. Jan 14 01:18:59.614000 audit[5260]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffdd8bae00 a2=3 a3=0 items=0 ppid=1 pid=5260 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:59.656288 kernel: audit: type=1300 audit(1768353539.614:765): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffdd8bae00 a2=3 a3=0 items=0 ppid=1 pid=5260 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:18:59.656382 kernel: audit: type=1327 audit(1768353539.614:765): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:59.614000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:18:59.656327 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 14 01:18:59.663000 audit[5260]: USER_START pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.673399 kernel: audit: type=1105 audit(1768353539.663:766): pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.674000 audit[5264]: CRED_ACQ pid=5264 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:18:59.682041 kernel: audit: type=1103 audit(1768353539.674:767): pid=5264 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.046286 sshd[5264]: Connection closed by 68.220.241.50 port 45228 Jan 14 01:19:00.047805 sshd-session[5260]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:00.048000 audit[5260]: USER_END pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.055424 systemd[1]: 
sshd@10-77.42.79.167:22-68.220.241.50:45228.service: Deactivated successfully. Jan 14 01:19:00.058083 systemd[1]: session-11.scope: Deactivated successfully. Jan 14 01:19:00.059774 kernel: audit: type=1106 audit(1768353540.048:768): pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.059827 kernel: audit: type=1104 audit(1768353540.048:769): pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.048000 audit[5260]: CRED_DISP pid=5260 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.061642 systemd-logind[1657]: Session 11 logged out. Waiting for processes to exit. Jan 14 01:19:00.063673 systemd-logind[1657]: Removed session 11. Jan 14 01:19:00.054000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.79.167:22-68.220.241.50:45228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:00.183467 systemd[1]: Started sshd@11-77.42.79.167:22-68.220.241.50:45242.service - OpenSSH per-connection server daemon (68.220.241.50:45242). Jan 14 01:19:00.182000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.79.167:22-68.220.241.50:45242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:19:00.865000 audit[5277]: USER_ACCT pid=5277 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.867155 sshd[5277]: Accepted publickey for core from 68.220.241.50 port 45242 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:00.867000 audit[5277]: CRED_ACQ pid=5277 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.867000 audit[5277]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6b1ce200 a2=3 a3=0 items=0 ppid=1 pid=5277 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:00.867000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:00.870479 sshd-session[5277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:00.874807 systemd-logind[1657]: New session 12 of user core. Jan 14 01:19:00.881179 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 14 01:19:00.885000 audit[5277]: USER_START pid=5277 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.888000 audit[5285]: CRED_ACQ pid=5285 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:00.994189 kubelet[2863]: E0114 01:19:00.993862 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:19:00.996867 containerd[1677]: time="2026-01-14T01:19:00.996794712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:19:01.355856 sshd[5285]: Connection closed by 68.220.241.50 port 45242 Jan 14 01:19:01.354575 sshd-session[5277]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:01.355000 audit[5277]: USER_END pid=5277 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:01.355000 audit[5277]: CRED_DISP 
pid=5277 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:01.359597 systemd-logind[1657]: Session 12 logged out. Waiting for processes to exit. Jan 14 01:19:01.362363 systemd[1]: sshd@11-77.42.79.167:22-68.220.241.50:45242.service: Deactivated successfully. Jan 14 01:19:01.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.79.167:22-68.220.241.50:45242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:01.367305 systemd[1]: session-12.scope: Deactivated successfully. Jan 14 01:19:01.370232 systemd-logind[1657]: Removed session 12. Jan 14 01:19:01.426659 containerd[1677]: time="2026-01-14T01:19:01.426496052Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:01.427904 containerd[1677]: time="2026-01-14T01:19:01.427800989Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:19:01.427904 containerd[1677]: time="2026-01-14T01:19:01.427873130Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:01.428137 kubelet[2863]: E0114 01:19:01.428096 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:19:01.428250 kubelet[2863]: E0114 01:19:01.428234 2863 kuberuntime_image.go:42] "Failed to pull 
image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:19:01.428448 kubelet[2863]: E0114 01:19:01.428412 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0fc0a424dd174edd96fbff1617ff49ab,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:01.431988 containerd[1677]: time="2026-01-14T01:19:01.431917873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:19:01.487416 systemd[1]: Started sshd@12-77.42.79.167:22-68.220.241.50:45248.service - OpenSSH per-connection server daemon (68.220.241.50:45248). Jan 14 01:19:01.486000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.79.167:22-68.220.241.50:45248 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:01.862692 containerd[1677]: time="2026-01-14T01:19:01.862591628Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:01.864521 containerd[1677]: time="2026-01-14T01:19:01.864297658Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:19:01.864521 containerd[1677]: time="2026-01-14T01:19:01.864359069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:01.865060 kubelet[2863]: E0114 01:19:01.864944 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:19:01.865060 kubelet[2863]: E0114 01:19:01.865032 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:19:01.865254 kubelet[2863]: E0114 01:19:01.865157 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgrmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Term
inationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-76bdf97696-8ln4x_calico-system(e35dc45d-4646-4773-8f3b-b3b00ec76393): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:01.866712 kubelet[2863]: E0114 01:19:01.866658 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:19:02.159000 audit[5295]: USER_ACCT pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:02.161212 sshd[5295]: Accepted publickey for core from 68.220.241.50 port 45248 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:02.162000 audit[5295]: CRED_ACQ pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 
01:19:02.162000 audit[5295]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffde7c21fb0 a2=3 a3=0 items=0 ppid=1 pid=5295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:02.162000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:02.164716 sshd-session[5295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:02.172397 systemd-logind[1657]: New session 13 of user core. Jan 14 01:19:02.179619 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 14 01:19:02.184000 audit[5295]: USER_START pid=5295 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:02.187000 audit[5299]: CRED_ACQ pid=5299 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:02.602457 sshd[5299]: Connection closed by 68.220.241.50 port 45248 Jan 14 01:19:02.603259 sshd-session[5295]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:02.605000 audit[5295]: USER_END pid=5295 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:02.607000 audit[5295]: CRED_DISP pid=5295 uid=0 auid=500 ses=13 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:02.613062 systemd-logind[1657]: Session 13 logged out. Waiting for processes to exit. Jan 14 01:19:02.613749 systemd[1]: sshd@12-77.42.79.167:22-68.220.241.50:45248.service: Deactivated successfully. Jan 14 01:19:02.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.79.167:22-68.220.241.50:45248 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:02.617695 systemd[1]: session-13.scope: Deactivated successfully. Jan 14 01:19:02.622613 systemd-logind[1657]: Removed session 13. Jan 14 01:19:03.991184 kubelet[2863]: E0114 01:19:03.991097 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:19:04.996291 kubelet[2863]: E0114 01:19:04.995883 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:19:05.992153 
kubelet[2863]: E0114 01:19:05.992107 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:19:07.735569 systemd[1]: Started sshd@13-77.42.79.167:22-68.220.241.50:55690.service - OpenSSH per-connection server daemon (68.220.241.50:55690). Jan 14 01:19:07.742256 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 14 01:19:07.742299 kernel: audit: type=1130 audit(1768353547.734:789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.79.167:22-68.220.241.50:55690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:07.734000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.79.167:22-68.220.241.50:55690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:19:08.391000 audit[5313]: USER_ACCT pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.399192 sshd-session[5313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:08.402196 sshd[5313]: Accepted publickey for core from 68.220.241.50 port 55690 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:08.410095 kernel: audit: type=1101 audit(1768353548.391:790): pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.391000 audit[5313]: CRED_ACQ pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.438207 kernel: audit: type=1103 audit(1768353548.391:791): pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.438367 kernel: audit: type=1006 audit(1768353548.391:792): pid=5313 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 14 01:19:08.428721 systemd-logind[1657]: New session 14 of user core. 
Jan 14 01:19:08.391000 audit[5313]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd2494a130 a2=3 a3=0 items=0 ppid=1 pid=5313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:08.453253 kernel: audit: type=1300 audit(1768353548.391:792): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd2494a130 a2=3 a3=0 items=0 ppid=1 pid=5313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:08.391000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:08.454484 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 14 01:19:08.459067 kernel: audit: type=1327 audit(1768353548.391:792): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:08.459000 audit[5313]: USER_START pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.461000 audit[5317]: CRED_ACQ pid=5317 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.470995 kernel: audit: type=1105 audit(1768353548.459:793): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.471185 kernel: audit: type=1103 audit(1768353548.461:794): pid=5317 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.859176 sshd[5317]: Connection closed by 68.220.241.50 port 55690 Jan 14 01:19:08.860191 sshd-session[5313]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:08.862000 audit[5313]: USER_END pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.874878 kernel: audit: type=1106 audit(1768353548.862:795): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.874972 kernel: audit: type=1104 audit(1768353548.862:796): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.862000 audit[5313]: CRED_DISP pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:08.871567 systemd[1]: 
sshd@13-77.42.79.167:22-68.220.241.50:55690.service: Deactivated successfully. Jan 14 01:19:08.874222 systemd[1]: session-14.scope: Deactivated successfully. Jan 14 01:19:08.877662 systemd-logind[1657]: Session 14 logged out. Waiting for processes to exit. Jan 14 01:19:08.880998 systemd-logind[1657]: Removed session 14. Jan 14 01:19:08.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.79.167:22-68.220.241.50:55690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:08.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.79.167:22-68.220.241.50:55704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:08.993358 systemd[1]: Started sshd@14-77.42.79.167:22-68.220.241.50:55704.service - OpenSSH per-connection server daemon (68.220.241.50:55704). 
Jan 14 01:19:09.652000 audit[5328]: USER_ACCT pid=5328 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:09.653253 sshd[5328]: Accepted publickey for core from 68.220.241.50 port 55704 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:09.653000 audit[5328]: CRED_ACQ pid=5328 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:09.653000 audit[5328]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe9a89aa50 a2=3 a3=0 items=0 ppid=1 pid=5328 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:09.653000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:09.655762 sshd-session[5328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:09.661470 systemd-logind[1657]: New session 15 of user core. Jan 14 01:19:09.665231 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 14 01:19:09.667000 audit[5328]: USER_START pid=5328 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:09.669000 audit[5332]: CRED_ACQ pid=5332 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:09.992309 containerd[1677]: time="2026-01-14T01:19:09.992235598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:19:10.290156 sshd[5332]: Connection closed by 68.220.241.50 port 55704 Jan 14 01:19:10.291738 sshd-session[5328]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:10.298000 audit[5328]: USER_END pid=5328 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:10.298000 audit[5328]: CRED_DISP pid=5328 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:10.307327 systemd[1]: sshd@14-77.42.79.167:22-68.220.241.50:55704.service: Deactivated successfully. Jan 14 01:19:10.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.79.167:22-68.220.241.50:55704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:19:10.314511 systemd[1]: session-15.scope: Deactivated successfully. Jan 14 01:19:10.318789 systemd-logind[1657]: Session 15 logged out. Waiting for processes to exit. Jan 14 01:19:10.326375 systemd-logind[1657]: Removed session 15. Jan 14 01:19:10.426442 containerd[1677]: time="2026-01-14T01:19:10.426392457Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:10.430481 systemd[1]: Started sshd@15-77.42.79.167:22-68.220.241.50:55716.service - OpenSSH per-connection server daemon (68.220.241.50:55716). Jan 14 01:19:10.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.79.167:22-68.220.241.50:55716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:10.431577 containerd[1677]: time="2026-01-14T01:19:10.430864828Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:19:10.432184 containerd[1677]: time="2026-01-14T01:19:10.431884323Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:10.433409 kubelet[2863]: E0114 01:19:10.433323 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:19:10.436712 kubelet[2863]: E0114 01:19:10.433432 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:19:10.439643 kubelet[2863]: E0114 01:19:10.438518 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcsgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-85b4f9c766-pwdc6_calico-system(8ba22e1c-b895-4e68-8414-171d12dc9bef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:10.440395 kubelet[2863]: E0114 01:19:10.440291 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:19:11.116000 audit[5342]: USER_ACCT pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh 
res=success' Jan 14 01:19:11.117856 sshd[5342]: Accepted publickey for core from 68.220.241.50 port 55716 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:11.117000 audit[5342]: CRED_ACQ pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:11.117000 audit[5342]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffefd65dff0 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:11.117000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:11.120079 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:11.129450 systemd-logind[1657]: New session 16 of user core. Jan 14 01:19:11.135230 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 14 01:19:11.139000 audit[5342]: USER_START pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:11.141000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:12.110000 audit[5358]: NETFILTER_CFG table=filter:135 family=2 entries=26 op=nft_register_rule pid=5358 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:12.110000 audit[5358]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffd3e326b50 a2=0 a3=7ffd3e326b3c items=0 ppid=3013 pid=5358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:12.110000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:12.119000 audit[5358]: NETFILTER_CFG table=nat:136 family=2 entries=20 op=nft_register_rule pid=5358 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:12.119000 audit[5358]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd3e326b50 a2=0 a3=0 items=0 ppid=3013 pid=5358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:12.119000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:12.146000 audit[5360]: NETFILTER_CFG table=filter:137 family=2 entries=38 op=nft_register_rule pid=5360 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:12.146000 audit[5360]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffcc5af9ec0 a2=0 a3=7ffcc5af9eac items=0 ppid=3013 pid=5360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:12.146000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:12.150000 audit[5360]: NETFILTER_CFG table=nat:138 family=2 entries=20 op=nft_register_rule pid=5360 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:12.150000 audit[5360]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffcc5af9ec0 a2=0 a3=0 items=0 ppid=3013 pid=5360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:12.150000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:12.242033 sshd[5346]: Connection closed by 68.220.241.50 port 55716 Jan 14 01:19:12.245677 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:12.246000 audit[5342]: USER_END pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 
addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:12.247000 audit[5342]: CRED_DISP pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:12.254333 systemd[1]: sshd@15-77.42.79.167:22-68.220.241.50:55716.service: Deactivated successfully. Jan 14 01:19:12.253000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.79.167:22-68.220.241.50:55716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:12.256760 systemd[1]: session-16.scope: Deactivated successfully. Jan 14 01:19:12.257916 systemd-logind[1657]: Session 16 logged out. Waiting for processes to exit. Jan 14 01:19:12.260476 systemd-logind[1657]: Removed session 16. Jan 14 01:19:12.379323 systemd[1]: Started sshd@16-77.42.79.167:22-68.220.241.50:55720.service - OpenSSH per-connection server daemon (68.220.241.50:55720). Jan 14 01:19:12.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.79.167:22-68.220.241.50:55720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:19:12.995023 containerd[1677]: time="2026-01-14T01:19:12.994952653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:19:13.066259 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 14 01:19:13.066420 kernel: audit: type=1101 audit(1768353553.055:821): pid=5373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.055000 audit[5373]: USER_ACCT pid=5373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.065666 sshd-session[5373]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:13.067165 sshd[5373]: Accepted publickey for core from 68.220.241.50 port 55720 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:13.088360 kernel: audit: type=1103 audit(1768353553.059:822): pid=5373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.059000 audit[5373]: CRED_ACQ pid=5373 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.086860 systemd-logind[1657]: New session 17 of user core. 
Jan 14 01:19:13.059000 audit[5373]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffbf6a0940 a2=3 a3=0 items=0 ppid=1 pid=5373 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:13.103611 kernel: audit: type=1006 audit(1768353553.059:823): pid=5373 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 14 01:19:13.103723 kernel: audit: type=1300 audit(1768353553.059:823): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffbf6a0940 a2=3 a3=0 items=0 ppid=1 pid=5373 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:13.104333 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 14 01:19:13.118489 kernel: audit: type=1327 audit(1768353553.059:823): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:13.059000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:13.113000 audit[5373]: USER_START pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.131279 kernel: audit: type=1105 audit(1768353553.113:824): pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 
01:19:13.138173 kernel: audit: type=1103 audit(1768353553.118:825): pid=5377 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.118000 audit[5377]: CRED_ACQ pid=5377 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.429140 containerd[1677]: time="2026-01-14T01:19:13.428666893Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:13.431660 containerd[1677]: time="2026-01-14T01:19:13.431566336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:19:13.431660 containerd[1677]: time="2026-01-14T01:19:13.431636036Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:13.432245 kubelet[2863]: E0114 01:19:13.432137 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:19:13.433249 kubelet[2863]: E0114 01:19:13.432802 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 
01:19:13.433351 kubelet[2863]: E0114 01:19:13.433325 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncwmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-dllbx_calico-apiserver(0c40b23d-843d-4125-8dd6-0c99d44bb1dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:13.434591 kubelet[2863]: E0114 01:19:13.434556 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:19:13.571859 sshd[5377]: Connection closed by 68.220.241.50 port 55720 Jan 14 01:19:13.574294 sshd-session[5373]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:13.577000 audit[5373]: USER_END pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.595000 audit[5373]: CRED_DISP pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.600121 kernel: audit: type=1106 audit(1768353553.577:826): pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.600226 kernel: audit: type=1104 audit(1768353553.595:827): pid=5373 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:13.603866 systemd[1]: sshd@16-77.42.79.167:22-68.220.241.50:55720.service: Deactivated successfully. Jan 14 01:19:13.604000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.79.167:22-68.220.241.50:55720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:13.611433 systemd[1]: session-17.scope: Deactivated successfully. Jan 14 01:19:13.618600 systemd-logind[1657]: Session 17 logged out. Waiting for processes to exit. 
Jan 14 01:19:13.622281 kernel: audit: type=1131 audit(1768353553.604:828): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.79.167:22-68.220.241.50:55720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:13.622535 systemd-logind[1657]: Removed session 17. Jan 14 01:19:13.713818 systemd[1]: Started sshd@17-77.42.79.167:22-68.220.241.50:55180.service - OpenSSH per-connection server daemon (68.220.241.50:55180). Jan 14 01:19:13.713000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.79.167:22-68.220.241.50:55180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:14.399000 audit[5387]: USER_ACCT pid=5387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.401156 sshd[5387]: Accepted publickey for core from 68.220.241.50 port 55180 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:14.401000 audit[5387]: CRED_ACQ pid=5387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.401000 audit[5387]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffb6368140 a2=3 a3=0 items=0 ppid=1 pid=5387 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:14.401000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:14.403563 
sshd-session[5387]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:14.409988 systemd-logind[1657]: New session 18 of user core. Jan 14 01:19:14.417228 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 14 01:19:14.420000 audit[5387]: USER_START pid=5387 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.423000 audit[5391]: CRED_ACQ pid=5391 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.884935 sshd[5391]: Connection closed by 68.220.241.50 port 55180 Jan 14 01:19:14.885623 sshd-session[5387]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:14.887000 audit[5387]: USER_END pid=5387 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.888000 audit[5387]: CRED_DISP pid=5387 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:14.893595 systemd[1]: sshd@17-77.42.79.167:22-68.220.241.50:55180.service: Deactivated successfully. 
Jan 14 01:19:14.893000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.79.167:22-68.220.241.50:55180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:14.901245 systemd[1]: session-18.scope: Deactivated successfully. Jan 14 01:19:14.907293 systemd-logind[1657]: Session 18 logged out. Waiting for processes to exit. Jan 14 01:19:14.908992 systemd-logind[1657]: Removed session 18. Jan 14 01:19:14.997605 kubelet[2863]: E0114 01:19:14.997534 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:19:15.991805 containerd[1677]: time="2026-01-14T01:19:15.991304327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:19:16.421607 containerd[1677]: time="2026-01-14T01:19:16.421399095Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:16.423084 containerd[1677]: time="2026-01-14T01:19:16.422984881Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:19:16.423211 containerd[1677]: time="2026-01-14T01:19:16.422992851Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:16.423741 kubelet[2863]: E0114 01:19:16.423468 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:19:16.423741 kubelet[2863]: E0114 01:19:16.423534 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:19:16.423741 kubelet[2863]: E0114 01:19:16.423678 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snjlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-l6m4k_calico-system(a11f31d9-176c-489f-9754-8429b5bd5389): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:16.425114 kubelet[2863]: E0114 01:19:16.425090 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:19:16.713000 audit[5403]: NETFILTER_CFG table=filter:139 family=2 entries=26 op=nft_register_rule pid=5403 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:16.713000 audit[5403]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffceb135de0 a2=0 a3=7ffceb135dcc items=0 ppid=3013 pid=5403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:16.713000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:16.717000 audit[5403]: NETFILTER_CFG table=nat:140 family=2 entries=104 op=nft_register_chain pid=5403 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:19:16.717000 audit[5403]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffceb135de0 a2=0 a3=7ffceb135dcc items=0 ppid=3013 pid=5403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:16.717000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:19:16.992876 containerd[1677]: time="2026-01-14T01:19:16.992677329Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:19:17.424065 containerd[1677]: time="2026-01-14T01:19:17.423944451Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:17.425203 containerd[1677]: time="2026-01-14T01:19:17.425171967Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:19:17.425326 containerd[1677]: time="2026-01-14T01:19:17.425242127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:17.425419 kubelet[2863]: E0114 01:19:17.425388 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:19:17.425700 kubelet[2863]: E0114 01:19:17.425432 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:19:17.425700 kubelet[2863]: E0114 01:19:17.425551 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlp4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-75bdffd95c-w8fr4_calico-apiserver(e9b7dca9-0cc9-40e4-b746-163e923a9fd3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:17.426981 kubelet[2863]: E0114 01:19:17.426953 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:19:20.023157 systemd[1]: Started sshd@18-77.42.79.167:22-68.220.241.50:55190.service - OpenSSH per-connection server daemon (68.220.241.50:55190). 
Jan 14 01:19:20.040642 kernel: kauditd_printk_skb: 17 callbacks suppressed Jan 14 01:19:20.040714 kernel: audit: type=1130 audit(1768353560.022:840): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.79.167:22-68.220.241.50:55190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:20.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.79.167:22-68.220.241.50:55190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:20.718000 audit[5405]: USER_ACCT pid=5405 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.721825 sshd[5405]: Accepted publickey for core from 68.220.241.50 port 55190 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:20.725271 sshd-session[5405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:20.741762 kernel: audit: type=1101 audit(1768353560.718:841): pid=5405 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.741869 kernel: audit: type=1103 audit(1768353560.721:842): pid=5405 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.721000 audit[5405]: CRED_ACQ pid=5405 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.741742 systemd-logind[1657]: New session 19 of user core. Jan 14 01:19:20.721000 audit[5405]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7a0bc7e0 a2=3 a3=0 items=0 ppid=1 pid=5405 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:20.753153 kernel: audit: type=1006 audit(1768353560.721:843): pid=5405 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 14 01:19:20.753295 kernel: audit: type=1300 audit(1768353560.721:843): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd7a0bc7e0 a2=3 a3=0 items=0 ppid=1 pid=5405 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:20.763670 kernel: audit: type=1327 audit(1768353560.721:843): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:20.721000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:20.763345 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 14 01:19:20.770000 audit[5405]: USER_START pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.778058 kernel: audit: type=1105 audit(1768353560.770:844): pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.778000 audit[5409]: CRED_ACQ pid=5409 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:20.793068 kernel: audit: type=1103 audit(1768353560.778:845): pid=5409 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:21.003797 containerd[1677]: time="2026-01-14T01:19:21.001739864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:19:21.197786 sshd[5409]: Connection closed by 68.220.241.50 port 55190 Jan 14 01:19:21.198991 sshd-session[5405]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:21.199000 audit[5405]: USER_END pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:21.206589 systemd[1]: sshd@18-77.42.79.167:22-68.220.241.50:55190.service: Deactivated successfully. Jan 14 01:19:21.199000 audit[5405]: CRED_DISP pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:21.209594 kernel: audit: type=1106 audit(1768353561.199:846): pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:21.209780 kernel: audit: type=1104 audit(1768353561.199:847): pid=5405 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:21.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.79.167:22-68.220.241.50:55190 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:21.214900 systemd[1]: session-19.scope: Deactivated successfully. Jan 14 01:19:21.222887 systemd-logind[1657]: Session 19 logged out. Waiting for processes to exit. Jan 14 01:19:21.225550 systemd-logind[1657]: Removed session 19. 
Jan 14 01:19:21.450833 containerd[1677]: time="2026-01-14T01:19:21.450628275Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:21.452495 containerd[1677]: time="2026-01-14T01:19:21.452355062Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:19:21.452664 containerd[1677]: time="2026-01-14T01:19:21.452437433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:21.453087 kubelet[2863]: E0114 01:19:21.453045 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:19:21.454768 kubelet[2863]: E0114 01:19:21.453097 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:19:21.454768 kubelet[2863]: E0114 01:19:21.453252 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:19:21.456065 containerd[1677]: time="2026-01-14T01:19:21.455872886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:19:21.872258 containerd[1677]: time="2026-01-14T01:19:21.871992278Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:19:21.874680 containerd[1677]: time="2026-01-14T01:19:21.873867856Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:19:21.874680 containerd[1677]: time="2026-01-14T01:19:21.873981326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:19:21.874831 kubelet[2863]: E0114 01:19:21.874195 2863 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:19:21.874831 kubelet[2863]: E0114 01:19:21.874238 2863 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:19:21.874831 kubelet[2863]: E0114 01:19:21.874359 2863 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn7g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pj96q_calico-system(f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:19:21.875782 kubelet[2863]: E0114 01:19:21.875744 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:19:21.993524 kubelet[2863]: E0114 01:19:21.993231 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:19:25.992514 kubelet[2863]: E0114 01:19:25.992417 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:19:25.997768 update_engine[1658]: I20260114 01:19:25.997337 1658 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 14 01:19:25.997768 update_engine[1658]: I20260114 01:19:25.997411 1658 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 14 01:19:26.000724 update_engine[1658]: I20260114 01:19:25.999713 1658 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 14 01:19:26.001746 update_engine[1658]: I20260114 01:19:26.001686 1658 omaha_request_params.cc:62] Current group set to developer Jan 14 01:19:26.003792 update_engine[1658]: I20260114 01:19:26.003734 1658 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 14 01:19:26.004282 locksmithd[1706]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 14 01:19:26.004544 update_engine[1658]: I20260114 01:19:26.004305 1658 update_attempter.cc:643] Scheduling an action processor start. 
Jan 14 01:19:26.004544 update_engine[1658]: I20260114 01:19:26.004367 1658 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:19:26.013721 update_engine[1658]: I20260114 01:19:26.012260 1658 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 14 01:19:26.013721 update_engine[1658]: I20260114 01:19:26.012388 1658 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:19:26.013721 update_engine[1658]: I20260114 01:19:26.012399 1658 omaha_request_action.cc:272] Request: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: Jan 14 01:19:26.013721 update_engine[1658]: I20260114 01:19:26.012408 1658 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:19:26.017932 update_engine[1658]: I20260114 01:19:26.017874 1658 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:19:26.019214 update_engine[1658]: I20260114 01:19:26.019174 1658 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:19:26.019812 update_engine[1658]: E20260114 01:19:26.019762 1658 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:19:26.019916 update_engine[1658]: I20260114 01:19:26.019827 1658 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 14 01:19:26.339063 systemd[1]: Started sshd@19-77.42.79.167:22-68.220.241.50:56000.service - OpenSSH per-connection server daemon (68.220.241.50:56000). 
Jan 14 01:19:26.351340 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:19:26.351387 kernel: audit: type=1130 audit(1768353566.338:849): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.79.167:22-68.220.241.50:56000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:26.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.79.167:22-68.220.241.50:56000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:27.003000 audit[5460]: USER_ACCT pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.014542 kernel: audit: type=1101 audit(1768353567.003:850): pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.016437 sshd[5460]: Accepted publickey for core from 68.220.241.50 port 56000 ssh2: RSA SHA256:N606FmpNAhhKPVJS1dzQHMLugdJj8W5cT95Lwd2Z2DA Jan 14 01:19:27.016000 audit[5460]: CRED_ACQ pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.019695 sshd-session[5460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:19:27.033027 kernel: audit: type=1103 audit(1768353567.016:851): pid=5460 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.033131 kernel: audit: type=1006 audit(1768353567.016:852): pid=5460 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 14 01:19:27.033149 kernel: audit: type=1300 audit(1768353567.016:852): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff9c936ed0 a2=3 a3=0 items=0 ppid=1 pid=5460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:27.016000 audit[5460]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff9c936ed0 a2=3 a3=0 items=0 ppid=1 pid=5460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:27.029287 systemd-logind[1657]: New session 20 of user core. Jan 14 01:19:27.016000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:27.043604 kernel: audit: type=1327 audit(1768353567.016:852): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:19:27.045800 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 14 01:19:27.050000 audit[5460]: USER_START pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.061165 kernel: audit: type=1105 audit(1768353567.050:853): pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.062000 audit[5464]: CRED_ACQ pid=5464 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.070062 kernel: audit: type=1103 audit(1768353567.062:854): pid=5464 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.449645 sshd[5464]: Connection closed by 68.220.241.50 port 56000 Jan 14 01:19:27.451491 sshd-session[5460]: pam_unix(sshd:session): session closed for user core Jan 14 01:19:27.474938 kernel: audit: type=1106 audit(1768353567.453:855): pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.453000 audit[5460]: 
USER_END pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.461570 systemd[1]: sshd@19-77.42.79.167:22-68.220.241.50:56000.service: Deactivated successfully. Jan 14 01:19:27.471133 systemd[1]: session-20.scope: Deactivated successfully. Jan 14 01:19:27.453000 audit[5460]: CRED_DISP pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.494045 kernel: audit: type=1104 audit(1768353567.453:856): pid=5460 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 14 01:19:27.494241 systemd-logind[1657]: Session 20 logged out. Waiting for processes to exit. Jan 14 01:19:27.462000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.79.167:22-68.220.241.50:56000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:19:27.496518 systemd-logind[1657]: Removed session 20. 
Jan 14 01:19:27.993350 kubelet[2863]: E0114 01:19:27.993246 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:19:30.993600 kubelet[2863]: E0114 01:19:30.992977 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:19:31.992470 kubelet[2863]: E0114 01:19:31.992406 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:19:32.994695 kubelet[2863]: E0114 01:19:32.994596 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:19:35.991522 kubelet[2863]: E0114 01:19:35.991444 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:19:36.003125 update_engine[1658]: I20260114 01:19:36.003046 1658 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:19:36.003125 update_engine[1658]: I20260114 01:19:36.003137 1658 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:19:36.003578 update_engine[1658]: I20260114 01:19:36.003461 1658 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:19:36.003999 update_engine[1658]: E20260114 01:19:36.003928 1658 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:19:36.003999 update_engine[1658]: I20260114 01:19:36.003983 1658 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 14 01:19:40.992517 kubelet[2863]: E0114 01:19:40.992376 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:19:41.991815 kubelet[2863]: E0114 01:19:41.991734 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:19:42.991636 kubelet[2863]: E0114 01:19:42.991552 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:19:45.831133 systemd[1]: cri-containerd-b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5.scope: Deactivated successfully. Jan 14 01:19:45.832481 systemd[1]: cri-containerd-b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5.scope: Consumed 15.811s CPU time, 110.5M memory peak. Jan 14 01:19:45.845631 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:19:45.845773 kernel: audit: type=1334 audit(1768353585.833:858): prog-id=146 op=UNLOAD Jan 14 01:19:45.833000 audit: BPF prog-id=146 op=UNLOAD Jan 14 01:19:45.833000 audit: BPF prog-id=150 op=UNLOAD Jan 14 01:19:45.852988 containerd[1677]: time="2026-01-14T01:19:45.847768270Z" level=info msg="received container exit event container_id:\"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\" id:\"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\" pid:3193 exit_status:1 exited_at:{seconds:1768353585 nanos:845962425}" Jan 14 01:19:45.854322 kernel: audit: type=1334 audit(1768353585.833:859): prog-id=150 op=UNLOAD Jan 14 01:19:45.911280 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5-rootfs.mount: Deactivated successfully. 
Jan 14 01:19:45.991253 kubelet[2863]: E0114 01:19:45.991204 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:19:45.995537 update_engine[1658]: I20260114 01:19:45.995052 1658 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:19:45.995537 update_engine[1658]: I20260114 01:19:45.995131 1658 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:19:45.995537 update_engine[1658]: I20260114 01:19:45.995497 1658 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:19:45.996207 update_engine[1658]: E20260114 01:19:45.996140 1658 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:19:45.996296 update_engine[1658]: I20260114 01:19:45.996266 1658 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 14 01:19:46.283804 kubelet[2863]: E0114 01:19:46.283369 2863 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40950->10.0.0.2:2379: read: connection timed out" Jan 14 01:19:46.473697 kubelet[2863]: I0114 01:19:46.473665 2863 status_manager.go:895] "Failed to get status for pod" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40882->10.0.0.2:2379: read: connection timed out" Jan 14 01:19:46.474078 kubelet[2863]: E0114 01:19:46.473571 2863 event.go:359] "Server 
rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40774->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-kube-controllers-85b4f9c766-pwdc6.188a741bbb4b04ef calico-system 1524 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:calico-kube-controllers-85b4f9c766-pwdc6,UID:8ba22e1c-b895-4e68-8414-171d12dc9bef,APIVersion:v1,ResourceVersion:792,FieldPath:spec.containers{calico-kube-controllers},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-2c3a114250,},FirstTimestamp:2026-01-14 01:17:46 +0000 UTC,LastTimestamp:2026-01-14 01:19:35.991353245 +0000 UTC m=+151.126272548,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-2c3a114250,}" Jan 14 01:19:46.529313 kubelet[2863]: I0114 01:19:46.529055 2863 scope.go:117] "RemoveContainer" containerID="b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5" Jan 14 01:19:46.532549 containerd[1677]: time="2026-01-14T01:19:46.532469327Z" level=info msg="CreateContainer within sandbox \"4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 14 01:19:46.548778 containerd[1677]: time="2026-01-14T01:19:46.548620695Z" level=info msg="Container 2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:19:46.563685 containerd[1677]: time="2026-01-14T01:19:46.563527238Z" level=info msg="CreateContainer within sandbox \"4058a0cb4d2ea7cceb5288fe91e4480c9e9b93315fd74fa8ee470991bbb2810d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id 
\"2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f\"" Jan 14 01:19:46.564611 containerd[1677]: time="2026-01-14T01:19:46.564563492Z" level=info msg="StartContainer for \"2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f\"" Jan 14 01:19:46.568042 containerd[1677]: time="2026-01-14T01:19:46.566597478Z" level=info msg="connecting to shim 2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f" address="unix:///run/containerd/s/ae10d5e669d8a94f38783b7b4a0ca9944c153ccc850bf49e719b33f66c2b4d47" protocol=ttrpc version=3 Jan 14 01:19:46.611187 systemd[1]: Started cri-containerd-2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f.scope - libcontainer container 2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f. Jan 14 01:19:46.623000 audit: BPF prog-id=256 op=LOAD Jan 14 01:19:46.626000 audit: BPF prog-id=257 op=LOAD Jan 14 01:19:46.630410 kernel: audit: type=1334 audit(1768353586.623:860): prog-id=256 op=LOAD Jan 14 01:19:46.630542 kernel: audit: type=1334 audit(1768353586.626:861): prog-id=257 op=LOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.637636 kernel: audit: type=1300 audit(1768353586.626:861): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.653828 kernel: audit: type=1327 audit(1768353586.626:861): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: BPF prog-id=257 op=UNLOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.673132 kernel: audit: type=1334 audit(1768353586.626:862): prog-id=257 op=UNLOAD Jan 14 01:19:46.673892 kernel: audit: type=1300 audit(1768353586.626:862): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.682449 kernel: audit: type=1327 audit(1768353586.626:862): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.693990 kernel: audit: type=1334 audit(1768353586.626:863): prog-id=258 op=LOAD Jan 14 01:19:46.626000 audit: BPF prog-id=258 op=LOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: BPF prog-id=259 op=LOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: BPF prog-id=259 op=UNLOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: BPF prog-id=258 op=UNLOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.626000 audit: BPF prog-id=260 op=LOAD Jan 14 01:19:46.626000 audit[5497]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2966 pid=5497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:46.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234383961393938613666336134316364643737363839386534306532 Jan 14 01:19:46.699685 containerd[1677]: time="2026-01-14T01:19:46.699648557Z" level=info msg="StartContainer for \"2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f\" returns successfully" Jan 14 01:19:47.123000 
audit: BPF prog-id=261 op=LOAD Jan 14 01:19:47.123000 audit: BPF prog-id=88 op=UNLOAD Jan 14 01:19:47.123151 systemd[1]: cri-containerd-3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200.scope: Deactivated successfully. Jan 14 01:19:47.125765 containerd[1677]: time="2026-01-14T01:19:47.123258624Z" level=info msg="received container exit event container_id:\"3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200\" id:\"3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200\" pid:2725 exit_status:1 exited_at:{seconds:1768353587 nanos:122819592}" Jan 14 01:19:47.123561 systemd[1]: cri-containerd-3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200.scope: Consumed 3.675s CPU time, 58.9M memory peak, 64K read from disk. Jan 14 01:19:47.126000 audit: BPF prog-id=103 op=UNLOAD Jan 14 01:19:47.126000 audit: BPF prog-id=107 op=UNLOAD Jan 14 01:19:47.156499 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200-rootfs.mount: Deactivated successfully. 
Jan 14 01:19:47.534447 kubelet[2863]: I0114 01:19:47.534364 2863 scope.go:117] "RemoveContainer" containerID="3a01b73acff305af9371a4d5ec7eddf8ffb54ffefc17182cde693be4fbd5e200" Jan 14 01:19:47.538633 containerd[1677]: time="2026-01-14T01:19:47.538486347Z" level=info msg="CreateContainer within sandbox \"f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 14 01:19:47.593853 containerd[1677]: time="2026-01-14T01:19:47.592245443Z" level=info msg="Container 6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:19:47.608708 containerd[1677]: time="2026-01-14T01:19:47.608650331Z" level=info msg="CreateContainer within sandbox \"f50d312db7adc764fe1a571849551f6ed41d373d3eb542b2f1eb6f2bb385d99e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7\"" Jan 14 01:19:47.609464 containerd[1677]: time="2026-01-14T01:19:47.609386533Z" level=info msg="StartContainer for \"6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7\"" Jan 14 01:19:47.611184 containerd[1677]: time="2026-01-14T01:19:47.611144478Z" level=info msg="connecting to shim 6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7" address="unix:///run/containerd/s/51f75a6a1db2f800d9509c36aa56baeb93cd8bb613eff19ecd20ca0273b33253" protocol=ttrpc version=3 Jan 14 01:19:47.653336 systemd[1]: Started cri-containerd-6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7.scope - libcontainer container 6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7. 
Jan 14 01:19:47.681000 audit: BPF prog-id=262 op=LOAD Jan 14 01:19:47.682000 audit: BPF prog-id=263 op=LOAD Jan 14 01:19:47.682000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.682000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.682000 audit: BPF prog-id=263 op=UNLOAD Jan 14 01:19:47.682000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.682000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.682000 audit: BPF prog-id=264 op=LOAD Jan 14 01:19:47.682000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.682000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.682000 audit: BPF prog-id=265 op=LOAD Jan 14 01:19:47.682000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.682000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.683000 audit: BPF prog-id=265 op=UNLOAD Jan 14 01:19:47.683000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.683000 audit: BPF prog-id=264 op=UNLOAD Jan 14 01:19:47.683000 audit[5540]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:19:47.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.683000 audit: BPF prog-id=266 op=LOAD Jan 14 01:19:47.683000 audit[5540]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2554 pid=5540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:47.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666373331653039313138613131346636383938313633386463653333 Jan 14 01:19:47.755851 containerd[1677]: time="2026-01-14T01:19:47.755681657Z" level=info msg="StartContainer for \"6f731e09118a114f68981638dce33d4ca8f3f81027772a8fe6904daeac592cd7\" returns successfully" Jan 14 01:19:47.992560 kubelet[2863]: E0114 01:19:47.992489 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:19:49.991567 kubelet[2863]: E0114 01:19:49.991463 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:19:51.355745 systemd[1]: cri-containerd-796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42.scope: Deactivated successfully. Jan 14 01:19:51.357983 systemd[1]: cri-containerd-796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42.scope: Consumed 1.826s CPU time, 21M memory peak, 184K read from disk. 
Jan 14 01:19:51.358905 containerd[1677]: time="2026-01-14T01:19:51.358792475Z" level=info msg="received container exit event container_id:\"796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42\" id:\"796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42\" pid:2713 exit_status:1 exited_at:{seconds:1768353591 nanos:357574751}" Jan 14 01:19:51.363081 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 14 01:19:51.363334 kernel: audit: type=1334 audit(1768353591.359:880): prog-id=108 op=UNLOAD Jan 14 01:19:51.359000 audit: BPF prog-id=108 op=UNLOAD Jan 14 01:19:51.359000 audit: BPF prog-id=112 op=UNLOAD Jan 14 01:19:51.366410 kernel: audit: type=1334 audit(1768353591.359:881): prog-id=112 op=UNLOAD Jan 14 01:19:51.360000 audit: BPF prog-id=267 op=LOAD Jan 14 01:19:51.360000 audit: BPF prog-id=93 op=UNLOAD Jan 14 01:19:51.372982 kernel: audit: type=1334 audit(1768353591.360:882): prog-id=267 op=LOAD Jan 14 01:19:51.373306 kernel: audit: type=1334 audit(1768353591.360:883): prog-id=93 op=UNLOAD Jan 14 01:19:51.400328 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42-rootfs.mount: Deactivated successfully. 
Jan 14 01:19:51.556320 kubelet[2863]: I0114 01:19:51.556056 2863 scope.go:117] "RemoveContainer" containerID="796563a9597a1445c5ef3119e8b1d115e398a1bf252f936ba687c307520e2a42" Jan 14 01:19:51.560460 containerd[1677]: time="2026-01-14T01:19:51.560388339Z" level=info msg="CreateContainer within sandbox \"f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 14 01:19:51.576528 containerd[1677]: time="2026-01-14T01:19:51.576445604Z" level=info msg="Container d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:19:51.590649 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4262265156.mount: Deactivated successfully. Jan 14 01:19:51.592068 containerd[1677]: time="2026-01-14T01:19:51.591686617Z" level=info msg="CreateContainer within sandbox \"f54c901d5815fc6e5ce49ff2b451b86d6e32687779cf938971514ba389080a30\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14\"" Jan 14 01:19:51.593767 containerd[1677]: time="2026-01-14T01:19:51.593273671Z" level=info msg="StartContainer for \"d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14\"" Jan 14 01:19:51.595560 containerd[1677]: time="2026-01-14T01:19:51.595491107Z" level=info msg="connecting to shim d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14" address="unix:///run/containerd/s/b81c031bfb5f4f100c3fd2621c451b66b36fdba020714d83593f519832a988f3" protocol=ttrpc version=3 Jan 14 01:19:51.638387 systemd[1]: Started cri-containerd-d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14.scope - libcontainer container d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14. 
Jan 14 01:19:51.666000 audit: BPF prog-id=268 op=LOAD Jan 14 01:19:51.672061 kernel: audit: type=1334 audit(1768353591.666:884): prog-id=268 op=LOAD Jan 14 01:19:51.671000 audit: BPF prog-id=269 op=LOAD Jan 14 01:19:51.678300 kernel: audit: type=1334 audit(1768353591.671:885): prog-id=269 op=LOAD Jan 14 01:19:51.693284 kernel: audit: type=1300 audit(1768353591.671:885): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.707158 kernel: audit: type=1327 audit(1768353591.671:885): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=269 op=UNLOAD Jan 14 01:19:51.721211 kernel: audit: type=1334 audit(1768353591.671:886): prog-id=269 op=UNLOAD Jan 14 01:19:51.721358 kernel: audit: type=1300 audit(1768353591.671:886): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=270 op=LOAD Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=271 op=LOAD Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=271 op=UNLOAD Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=270 op=UNLOAD Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.671000 audit: BPF prog-id=272 op=LOAD Jan 14 01:19:51.671000 audit[5591]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2586 pid=5591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:19:51.671000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436326264636262373163366432356164393637333864653964353761 Jan 14 01:19:51.754977 containerd[1677]: time="2026-01-14T01:19:51.754904894Z" level=info msg="StartContainer for \"d62bdcbb71c6d25ad96738de9d57a0c8f30f2fe133466b26f92d8dc00a0a1f14\" returns successfully" Jan 14 01:19:52.992507 kubelet[2863]: E0114 01:19:52.992344 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-w8fr4" podUID="e9b7dca9-0cc9-40e4-b746-163e923a9fd3" Jan 14 01:19:53.991841 kubelet[2863]: E0114 01:19:53.991777 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc" Jan 14 01:19:54.991162 kubelet[2863]: E0114 01:19:54.991089 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-76bdf97696-8ln4x" podUID="e35dc45d-4646-4773-8f3b-b3b00ec76393" Jan 14 01:19:55.998502 update_engine[1658]: I20260114 01:19:55.998384 1658 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:19:55.998996 update_engine[1658]: I20260114 01:19:55.998535 1658 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:19:55.999473 update_engine[1658]: I20260114 01:19:55.999406 1658 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 01:19:55.999739 update_engine[1658]: E20260114 01:19:55.999707 1658 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:19:55.999806 update_engine[1658]: I20260114 01:19:55.999786 1658 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:19:55.999806 update_engine[1658]: I20260114 01:19:55.999799 1658 omaha_request_action.cc:617] Omaha request response: Jan 14 01:19:55.999903 update_engine[1658]: E20260114 01:19:55.999884 1658 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 14 01:19:55.999927 update_engine[1658]: I20260114 01:19:55.999907 1658 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. 
Jan 14 01:19:55.999927 update_engine[1658]: I20260114 01:19:55.999912 1658 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:19:55.999927 update_engine[1658]: I20260114 01:19:55.999922 1658 update_attempter.cc:306] Processing Done. Jan 14 01:19:55.999974 update_engine[1658]: E20260114 01:19:55.999934 1658 update_attempter.cc:619] Update failed. Jan 14 01:19:55.999974 update_engine[1658]: I20260114 01:19:55.999940 1658 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 14 01:19:55.999974 update_engine[1658]: I20260114 01:19:55.999945 1658 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 14 01:19:55.999974 update_engine[1658]: I20260114 01:19:55.999951 1658 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Jan 14 01:19:56.000057 update_engine[1658]: I20260114 01:19:56.000030 1658 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 01:19:56.000057 update_engine[1658]: I20260114 01:19:56.000049 1658 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 01:19:56.000087 update_engine[1658]: I20260114 01:19:56.000056 1658 omaha_request_action.cc:272] Request: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: Jan 14 01:19:56.000087 update_engine[1658]: I20260114 01:19:56.000061 1658 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 01:19:56.000087 update_engine[1658]: I20260114 01:19:56.000078 1658 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 01:19:56.000340 update_engine[1658]: I20260114 01:19:56.000315 1658 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 01:19:56.000680 update_engine[1658]: E20260114 01:19:56.000658 1658 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 01:19:56.000739 locksmithd[1706]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 14 01:19:56.001080 update_engine[1658]: I20260114 01:19:56.001052 1658 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 01:19:56.001080 update_engine[1658]: I20260114 01:19:56.001067 1658 omaha_request_action.cc:617] Omaha request response: Jan 14 01:19:56.001080 update_engine[1658]: I20260114 01:19:56.001075 1658 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:19:56.001080 update_engine[1658]: I20260114 01:19:56.001080 1658 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 01:19:56.001200 update_engine[1658]: I20260114 01:19:56.001085 1658 update_attempter.cc:306] Processing Done. Jan 14 01:19:56.001200 update_engine[1658]: I20260114 01:19:56.001091 1658 update_attempter.cc:310] Error event sent. Jan 14 01:19:56.001200 update_engine[1658]: I20260114 01:19:56.001099 1658 update_check_scheduler.cc:74] Next update check in 49m29s Jan 14 01:19:56.001403 locksmithd[1706]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 14 01:19:56.284918 kubelet[2863]: E0114 01:19:56.284567 2863 controller.go:195] "Failed to update lease" err="Put \"https://77.42.79.167:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-2c3a114250?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 14 01:19:57.950205 systemd[1]: cri-containerd-2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f.scope: Deactivated successfully. 
Jan 14 01:19:57.952000 audit: BPF prog-id=256 op=UNLOAD Jan 14 01:19:57.955333 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 14 01:19:57.955476 kernel: audit: type=1334 audit(1768353597.952:892): prog-id=256 op=UNLOAD Jan 14 01:19:57.952000 audit: BPF prog-id=260 op=UNLOAD Jan 14 01:19:57.962920 containerd[1677]: time="2026-01-14T01:19:57.962825682Z" level=info msg="received container exit event container_id:\"2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f\" id:\"2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f\" pid:5509 exit_status:1 exited_at:{seconds:1768353597 nanos:953214497}" Jan 14 01:19:57.967249 kernel: audit: type=1334 audit(1768353597.952:893): prog-id=260 op=UNLOAD Jan 14 01:19:58.013407 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f-rootfs.mount: Deactivated successfully. Jan 14 01:19:58.589223 kubelet[2863]: I0114 01:19:58.588848 2863 scope.go:117] "RemoveContainer" containerID="b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5" Jan 14 01:19:58.590965 kubelet[2863]: I0114 01:19:58.589328 2863 scope.go:117] "RemoveContainer" containerID="2489a998a6f3a41cdd776898e40e2cec013aef19573837879172b3b0a8b35a6f" Jan 14 01:19:58.590965 kubelet[2863]: E0114 01:19:58.589528 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-rblzk_tigera-operator(1a843e5a-8b6c-4213-b568-9d2f828dc754)\"" pod="tigera-operator/tigera-operator-7dcd859c48-rblzk" podUID="1a843e5a-8b6c-4213-b568-9d2f828dc754" Jan 14 01:19:58.591845 containerd[1677]: time="2026-01-14T01:19:58.591794972Z" level=info msg="RemoveContainer for \"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\"" Jan 14 01:19:58.599424 containerd[1677]: time="2026-01-14T01:19:58.599228771Z" 
level=info msg="RemoveContainer for \"b93ad2b06ddc57d796e4639b414d9ec23373cde56e82819f3e8b842ea6f487a5\" returns successfully" Jan 14 01:19:58.991603 kubelet[2863]: E0114 01:19:58.991397 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-l6m4k" podUID="a11f31d9-176c-489f-9754-8429b5bd5389" Jan 14 01:20:01.992604 kubelet[2863]: E0114 01:20:01.992504 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pj96q" podUID="f9f7ed9b-3d95-4d57-9bbf-9bd4bf98db1b" Jan 14 01:20:03.990815 kubelet[2863]: E0114 01:20:03.990740 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-85b4f9c766-pwdc6" podUID="8ba22e1c-b895-4e68-8414-171d12dc9bef" Jan 14 01:20:04.992246 kubelet[2863]: E0114 01:20:04.992194 2863 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-75bdffd95c-dllbx" podUID="0c40b23d-843d-4125-8dd6-0c99d44bb1dc"