Jan 23 01:04:37.890640 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 22:22:03 -00 2026
Jan 23 01:04:37.890677 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e8d7116310bea9a494780b8becdce41e7cc03ed509d8e2363e08981a47b3edc6
Jan 23 01:04:37.890689 kernel: BIOS-provided physical RAM map:
Jan 23 01:04:37.890698 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 23 01:04:37.890705 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 23 01:04:37.890713 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 23 01:04:37.890729 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 23 01:04:37.890743 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 23 01:04:37.890751 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 23 01:04:37.890759 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 23 01:04:37.890767 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable
Jan 23 01:04:37.890775 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 23 01:04:37.890782 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 23 01:04:37.890790 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 23 01:04:37.890803 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 23 01:04:37.890811 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 23 01:04:37.890820 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 23 01:04:37.890828 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 23 01:04:37.890836 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 23 01:04:37.890844 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 23 01:04:37.890852 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 23 01:04:37.890863 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 23 01:04:37.890871 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 23 01:04:37.890879 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 23 01:04:37.890887 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 23 01:04:37.890895 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 01:04:37.890903 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 23 01:04:37.890911 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 23 01:04:37.890919 kernel: NX (Execute Disable) protection: active
Jan 23 01:04:37.890927 kernel: APIC: Static calls initialized
Jan 23 01:04:37.890935 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable
Jan 23 01:04:37.890944 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable
Jan 23 01:04:37.890952 kernel: extended physical RAM map:
Jan 23 01:04:37.890963 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jan 23 01:04:37.890971 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Jan 23 01:04:37.890979 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jan 23 01:04:37.890987 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Jan 23 01:04:37.890996 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jan 23 01:04:37.891004 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Jan 23 01:04:37.891012 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jan 23 01:04:37.891025 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable
Jan 23 01:04:37.891035 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable
Jan 23 01:04:37.891044 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable
Jan 23 01:04:37.891053 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable
Jan 23 01:04:37.891061 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable
Jan 23 01:04:37.891070 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved
Jan 23 01:04:37.891078 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable
Jan 23 01:04:37.891087 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved
Jan 23 01:04:37.891098 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable
Jan 23 01:04:37.891107 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Jan 23 01:04:37.891115 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Jan 23 01:04:37.891124 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Jan 23 01:04:37.891133 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable
Jan 23 01:04:37.891141 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved
Jan 23 01:04:37.891150 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS
Jan 23 01:04:37.891158 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable
Jan 23 01:04:37.891167 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved
Jan 23 01:04:37.891176 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS
Jan 23 01:04:37.891185 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jan 23 01:04:37.891196 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 01:04:37.891204 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jan 23 01:04:37.891213 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable
Jan 23 01:04:37.891222 kernel: efi: EFI v2.7 by EDK II
Jan 23 01:04:37.891230 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018
Jan 23 01:04:37.891239 kernel: random: crng init done
Jan 23 01:04:37.891248 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jan 23 01:04:37.891257 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jan 23 01:04:37.891265 kernel: secureboot: Secure boot disabled
Jan 23 01:04:37.891274 kernel: SMBIOS 2.8 present.
Jan 23 01:04:37.891283 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Jan 23 01:04:37.891291 kernel: DMI: Memory slots populated: 1/1
Jan 23 01:04:37.891302 kernel: Hypervisor detected: KVM
Jan 23 01:04:37.891311 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 23 01:04:37.891319 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 01:04:37.891328 kernel: kvm-clock: using sched offset of 7180383828 cycles
Jan 23 01:04:37.891337 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 01:04:37.891346 kernel: tsc: Detected 2294.608 MHz processor
Jan 23 01:04:37.891356 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 23 01:04:37.891365 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 23 01:04:37.891373 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000
Jan 23 01:04:37.891383 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jan 23 01:04:37.891394 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 23 01:04:37.891403 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000
Jan 23 01:04:37.891412 kernel: Using GB pages for direct mapping
Jan 23 01:04:37.891421 kernel: ACPI: Early table checksum verification disabled
Jan 23 01:04:37.891430 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Jan 23 01:04:37.891439 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Jan 23 01:04:37.891448 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 01:04:37.891457 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 01:04:37.891466 kernel: ACPI: FACS 0x000000007FBDD000 000040
Jan 23 01:04:37.891478 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 01:04:37.891487 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 01:04:37.891495 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 01:04:37.891504 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 23 01:04:37.891524 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3]
Jan 23 01:04:37.891533 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b]
Jan 23 01:04:37.891542 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Jan 23 01:04:37.891551 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f]
Jan 23 01:04:37.891561 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b]
Jan 23 01:04:37.891572 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027]
Jan 23 01:04:37.891582 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037]
Jan 23 01:04:37.891601 kernel: No NUMA configuration found
Jan 23 01:04:37.891610 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff]
Jan 23 01:04:37.891619 kernel: NODE_DATA(0) allocated [mem 0x17fff6dc0-0x17fffdfff]
Jan 23 01:04:37.891629 kernel: Zone ranges:
Jan 23 01:04:37.891638 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 01:04:37.891647 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 01:04:37.891656 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff]
Jan 23 01:04:37.891667 kernel: Device empty
Jan 23 01:04:37.891676 kernel: Movable zone start for each node
Jan 23 01:04:37.891685 kernel: Early memory node ranges
Jan 23 01:04:37.891694 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jan 23 01:04:37.891703 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Jan 23 01:04:37.891712 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Jan 23 01:04:37.891720 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Jan 23 01:04:37.891729 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff]
Jan 23 01:04:37.891738 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff]
Jan 23 01:04:37.891747 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff]
Jan 23 01:04:37.891767 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff]
Jan 23 01:04:37.891777 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff]
Jan 23 01:04:37.891788 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff]
Jan 23 01:04:37.891798 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff]
Jan 23 01:04:37.891808 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 01:04:37.891817 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jan 23 01:04:37.891827 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Jan 23 01:04:37.891838 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 01:04:37.891849 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Jan 23 01:04:37.891859 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jan 23 01:04:37.891869 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Jan 23 01:04:37.891878 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jan 23 01:04:37.891888 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Jan 23 01:04:37.891898 kernel: On node 0, zone Normal: 276 pages in unavailable ranges
Jan 23 01:04:37.891908 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 01:04:37.891918 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 01:04:37.891928 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 01:04:37.891941 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 01:04:37.891951 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 01:04:37.891960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 01:04:37.891970 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 01:04:37.891980 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 01:04:37.891990 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 01:04:37.891999 kernel: TSC deadline timer available
Jan 23 01:04:37.892009 kernel: CPU topo: Max. logical packages: 2
Jan 23 01:04:37.892019 kernel: CPU topo: Max. logical dies: 2
Jan 23 01:04:37.892031 kernel: CPU topo: Max. dies per package: 1
Jan 23 01:04:37.892041 kernel: CPU topo: Max. threads per core: 1
Jan 23 01:04:37.892051 kernel: CPU topo: Num. cores per package: 1
Jan 23 01:04:37.892060 kernel: CPU topo: Num. threads per package: 1
Jan 23 01:04:37.892070 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jan 23 01:04:37.892080 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 01:04:37.892090 kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 23 01:04:37.892099 kernel: kvm-guest: setup PV sched yield
Jan 23 01:04:37.892109 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Jan 23 01:04:37.892121 kernel: Booting paravirtualized kernel on KVM
Jan 23 01:04:37.892131 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 01:04:37.892141 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jan 23 01:04:37.892151 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jan 23 01:04:37.892165 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jan 23 01:04:37.892191 kernel: pcpu-alloc: [0] 0 1
Jan 23 01:04:37.892201 kernel: kvm-guest: PV spinlocks enabled
Jan 23 01:04:37.892216 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 23 01:04:37.892236 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e8d7116310bea9a494780b8becdce41e7cc03ed509d8e2363e08981a47b3edc6
Jan 23 01:04:37.892249 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 01:04:37.892259 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 23 01:04:37.892269 kernel: Fallback order for Node 0: 0
Jan 23 01:04:37.892284 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694
Jan 23 01:04:37.892295 kernel: Policy zone: Normal
Jan 23 01:04:37.892305 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 01:04:37.892315 kernel: software IO TLB: area num 2.
Jan 23 01:04:37.892325 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 23 01:04:37.892337 kernel: ftrace: allocating 40097 entries in 157 pages
Jan 23 01:04:37.892347 kernel: ftrace: allocated 157 pages with 5 groups
Jan 23 01:04:37.892357 kernel: Dynamic Preempt: voluntary
Jan 23 01:04:37.892366 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 01:04:37.892377 kernel: rcu: RCU event tracing is enabled.
Jan 23 01:04:37.892388 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 23 01:04:37.892397 kernel: Trampoline variant of Tasks RCU enabled.
Jan 23 01:04:37.892407 kernel: Rude variant of Tasks RCU enabled.
Jan 23 01:04:37.892417 kernel: Tracing variant of Tasks RCU enabled.
Jan 23 01:04:37.892427 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 01:04:37.892439 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 23 01:04:37.892449 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 23 01:04:37.892459 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 23 01:04:37.892469 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 23 01:04:37.892479 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jan 23 01:04:37.892489 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 01:04:37.892499 kernel: Console: colour dummy device 80x25
Jan 23 01:04:37.894537 kernel: printk: legacy console [tty0] enabled
Jan 23 01:04:37.894568 kernel: printk: legacy console [ttyS0] enabled
Jan 23 01:04:37.894578 kernel: ACPI: Core revision 20240827
Jan 23 01:04:37.894588 kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 01:04:37.894597 kernel: x2apic enabled
Jan 23 01:04:37.894607 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 01:04:37.894616 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 23 01:04:37.894626 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 23 01:04:37.894636 kernel: kvm-guest: setup PV IPIs
Jan 23 01:04:37.894645 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Jan 23 01:04:37.894657 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Jan 23 01:04:37.894667 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 01:04:37.894676 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 23 01:04:37.894690 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 23 01:04:37.894704 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 01:04:37.894714 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Jan 23 01:04:37.894723 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 23 01:04:37.894732 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 23 01:04:37.894746 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 01:04:37.894756 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 01:04:37.894765 kernel: TAA: Mitigation: Clear CPU buffers
Jan 23 01:04:37.894777 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Jan 23 01:04:37.894787 kernel: active return thunk: its_return_thunk
Jan 23 01:04:37.894796 kernel: ITS: Mitigation: Aligned branch/return thunks
Jan 23 01:04:37.894805 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 01:04:37.894814 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 01:04:37.894823 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 01:04:37.894833 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Jan 23 01:04:37.894842 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Jan 23 01:04:37.894851 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Jan 23 01:04:37.894860 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 23 01:04:37.894872 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 23 01:04:37.894882 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Jan 23 01:04:37.894891 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Jan 23 01:04:37.894900 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Jan 23 01:04:37.894909 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Jan 23 01:04:37.894918 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Jan 23 01:04:37.894928 kernel: Freeing SMP alternatives memory: 32K
Jan 23 01:04:37.894937 kernel: pid_max: default: 32768 minimum: 301
Jan 23 01:04:37.894946 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 23 01:04:37.894955 kernel: landlock: Up and running.
Jan 23 01:04:37.894964 kernel: SELinux: Initializing.
Jan 23 01:04:37.894973 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 23 01:04:37.894985 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 23 01:04:37.894994 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Jan 23 01:04:37.895003 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Jan 23 01:04:37.895013 kernel: ... version: 2
Jan 23 01:04:37.895023 kernel: ... bit width: 48
Jan 23 01:04:37.895032 kernel: ... generic registers: 8
Jan 23 01:04:37.895042 kernel: ... value mask: 0000ffffffffffff
Jan 23 01:04:37.895052 kernel: ... max period: 00007fffffffffff
Jan 23 01:04:37.895061 kernel: ... fixed-purpose events: 3
Jan 23 01:04:37.895071 kernel: ... event mask: 00000007000000ff
Jan 23 01:04:37.895083 kernel: signal: max sigframe size: 3632
Jan 23 01:04:37.895092 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 01:04:37.895103 kernel: rcu: Max phase no-delay instances is 400.
Jan 23 01:04:37.895112 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 23 01:04:37.895122 kernel: smp: Bringing up secondary CPUs ...
Jan 23 01:04:37.895131 kernel: smpboot: x86: Booting SMP configuration:
Jan 23 01:04:37.895141 kernel: .... node #0, CPUs: #1
Jan 23 01:04:37.895150 kernel: smp: Brought up 1 node, 2 CPUs
Jan 23 01:04:37.895160 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS)
Jan 23 01:04:37.895173 kernel: Memory: 3945188K/4186776K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46196K init, 2564K bss, 236712K reserved, 0K cma-reserved)
Jan 23 01:04:37.895182 kernel: devtmpfs: initialized
Jan 23 01:04:37.895192 kernel: x86/mm: Memory block size: 128MB
Jan 23 01:04:37.895201 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Jan 23 01:04:37.895211 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Jan 23 01:04:37.895221 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Jan 23 01:04:37.895230 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Jan 23 01:04:37.895240 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes)
Jan 23 01:04:37.895249 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes)
Jan 23 01:04:37.895261 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 01:04:37.895271 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 23 01:04:37.895280 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 01:04:37.895290 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 01:04:37.895299 kernel: audit: initializing netlink subsys (disabled)
Jan 23 01:04:37.895309 kernel: audit: type=2000 audit(1769130273.997:1): state=initialized audit_enabled=0 res=1
Jan 23 01:04:37.895319 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 01:04:37.895328 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 01:04:37.895340 kernel: cpuidle: using governor menu
Jan 23 01:04:37.895350 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 01:04:37.895359 kernel: dca service started, version 1.12.1
Jan 23 01:04:37.895369 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jan 23 01:04:37.895378 kernel: PCI: Using configuration type 1 for base access
Jan 23 01:04:37.895388 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 01:04:37.895398 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 01:04:37.895407 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 01:04:37.895416 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 01:04:37.895428 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 01:04:37.895437 kernel: ACPI: Added _OSI(Module Device)
Jan 23 01:04:37.895447 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 01:04:37.895456 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 01:04:37.895466 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 01:04:37.895475 kernel: ACPI: Interpreter enabled
Jan 23 01:04:37.895485 kernel: ACPI: PM: (supports S0 S3 S5)
Jan 23 01:04:37.895494 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 01:04:37.895504 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 01:04:37.895522 kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 01:04:37.895534 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 23 01:04:37.895544 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 01:04:37.895716 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 23 01:04:37.895812 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jan 23 01:04:37.895899 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jan 23 01:04:37.895911 kernel: PCI host bridge to bus 0000:00
Jan 23 01:04:37.896001 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 23 01:04:37.896084 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 23 01:04:37.896161 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 01:04:37.896237 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Jan 23 01:04:37.896312 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jan 23 01:04:37.896389 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Jan 23 01:04:37.896466 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 01:04:37.896596 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 23 01:04:37.896698 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 01:04:37.896789 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Jan 23 01:04:37.896879 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Jan 23 01:04:37.896967 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Jan 23 01:04:37.897055 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jan 23 01:04:37.897142 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 01:04:37.897241 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.897329 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Jan 23 01:04:37.897417 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 01:04:37.897504 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Jan 23 01:04:37.898633 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Jan 23 01:04:37.898731 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 23 01:04:37.898824 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.898911 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Jan 23 01:04:37.898991 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 01:04:37.899071 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Jan 23 01:04:37.899150 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Jan 23 01:04:37.899239 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.899320 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Jan 23 01:04:37.899403 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 01:04:37.899481 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Jan 23 01:04:37.899641 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Jan 23 01:04:37.899729 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.899846 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Jan 23 01:04:37.899928 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 23 01:04:37.900007 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Jan 23 01:04:37.900090 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Jan 23 01:04:37.900178 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.900258 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Jan 23 01:04:37.900338 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 23 01:04:37.900416 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Jan 23 01:04:37.900494 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Jan 23 01:04:37.903650 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.903752 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Jan 23 01:04:37.903833 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 23 01:04:37.903913 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Jan 23 01:04:37.903991 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Jan 23 01:04:37.904077 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.904156 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Jan 23 01:04:37.904236 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 23 01:04:37.904313 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Jan 23 01:04:37.904391 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Jan 23 01:04:37.904476 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.904595 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Jan 23 01:04:37.904673 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 23 01:04:37.904750 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Jan 23 01:04:37.904827 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Jan 23 01:04:37.904917 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.904996 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Jan 23 01:04:37.905073 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 23 01:04:37.905150 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Jan 23 01:04:37.905227 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Jan 23 01:04:37.905311 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.905406 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Jan 23 01:04:37.905488 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 23 01:04:37.905923 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Jan 23 01:04:37.906006 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Jan 23 01:04:37.906110 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.906185 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Jan 23 01:04:37.906258 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 23 01:04:37.906332 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Jan 23 01:04:37.906410 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Jan 23 01:04:37.906493 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.907287 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Jan 23 01:04:37.907381 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 23 01:04:37.907461 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Jan 23 01:04:37.907549 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Jan 23 01:04:37.907632 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.907708 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Jan 23 01:04:37.907781 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 23 01:04:37.907855 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Jan 23 01:04:37.907928 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Jan 23 01:04:37.908010 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.908087 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Jan 23 01:04:37.908160 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 23 01:04:37.908235 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Jan 23 01:04:37.908309 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Jan 23 01:04:37.908392 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.908467 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Jan 23 01:04:37.908552 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 23 01:04:37.908629 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Jan 23 01:04:37.908702 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Jan 23 01:04:37.908781 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.908855 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Jan 23 01:04:37.908928 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 23 01:04:37.909000 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Jan 23 01:04:37.909073 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Jan 23 01:04:37.909156 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.909230 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Jan 23 01:04:37.909304 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 23 01:04:37.909377 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Jan 23 01:04:37.909449 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Jan 23 01:04:37.910864 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.910955 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Jan 23 01:04:37.911031 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Jan 23 01:04:37.911103 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Jan 23 01:04:37.911175 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Jan 23 01:04:37.911253 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.911325 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Jan 23 01:04:37.911401 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Jan 23 01:04:37.911485 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff]
Jan 23 01:04:37.911585 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref]
Jan 23 01:04:37.911667 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.911740 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff]
Jan 23 01:04:37.911812 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Jan 23 01:04:37.911883 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff]
Jan 23 01:04:37.911953 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref]
Jan 23 01:04:37.912029 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.912104 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff]
Jan 23 01:04:37.912174 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Jan 23 01:04:37.912245 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff]
Jan 23 01:04:37.912315 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref]
Jan 23 01:04:37.912391 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.912463 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff]
Jan 23 01:04:37.913588 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Jan 23 01:04:37.913678 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff]
Jan 23 01:04:37.913756 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref]
Jan 23 01:04:37.913837 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.913911 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff]
Jan 23 01:04:37.914013 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Jan 23 01:04:37.914099 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff]
Jan 23 01:04:37.914171 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref]
Jan 23 01:04:37.914252 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.914324 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff]
Jan 23 01:04:37.914395 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Jan 23 01:04:37.914482 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff]
Jan 23 01:04:37.914591 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref]
Jan 23 01:04:37.914669 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.914740 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff]
Jan 23 01:04:37.915984 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Jan 23 01:04:37.916063 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff]
Jan 23 01:04:37.916133 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref]
Jan 23 01:04:37.916211 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.916282 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff]
Jan 23 01:04:37.916353 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Jan 23 01:04:37.916421 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff]
Jan 23 01:04:37.916489 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref]
Jan 23 01:04:37.916598 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.916668 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff]
Jan 23 01:04:37.916736 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Jan 23 01:04:37.916805 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff]
Jan 23 01:04:37.916875 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref]
Jan 23 01:04:37.916950 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.917020 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff]
Jan 23 01:04:37.917088 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Jan 23 01:04:37.917158 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff]
Jan 23 01:04:37.917226 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref]
Jan 23 01:04:37.917303 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 01:04:37.917375 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff]
Jan 23 01:04:37.917444 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Jan 23 01:04:37.919532 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff]
Jan 23 01:04:37.919650 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref]
Jan 23 01:04:37.919770 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jan 23 01:04:37.920273 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 23 01:04:37.920368 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jan 23 01:04:37.920445 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f]
Jan 23 01:04:37.920528 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff]
Jan 23 01:04:37.920608 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jan 23 01:04:37.920679 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f]
Jan 23 01:04:37.920761 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Jan 23 01:04:37.920833 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit]
Jan 23 01:04:37.920907 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 01:04:37.920977 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff]
Jan 23 01:04:37.921048 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff]
Jan 23 01:04:37.921118 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 23 01:04:37.921191 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 01:04:37.921285 kernel: pci_bus 0000:02: extended config space not accessible
Jan 23 01:04:37.921298 kernel: acpiphp: Slot [1] registered
Jan 23 01:04:37.921306 kernel: acpiphp: Slot [0] registered
Jan 23 01:04:37.921317 kernel: acpiphp: Slot [2] registered
Jan 23 01:04:37.921325 kernel: acpiphp: Slot [3] registered
Jan 23 01:04:37.921332 kernel: acpiphp: Slot [4] registered
Jan 23 01:04:37.921340 kernel: acpiphp: Slot [5] registered
Jan 23 01:04:37.921348 kernel: acpiphp: Slot [6] registered
Jan 23 01:04:37.921356 kernel: acpiphp: Slot [7] registered
Jan 23 01:04:37.921363 kernel: acpiphp: Slot [8] registered
Jan 23 01:04:37.921371 kernel: acpiphp: Slot [9] registered
Jan 23 01:04:37.921378 kernel: acpiphp: Slot [10] registered
Jan 23 01:04:37.921386 kernel: acpiphp: Slot [11] registered
Jan 23 01:04:37.921396 kernel: acpiphp: Slot [12] registered
Jan 23 01:04:37.921404 kernel: acpiphp: Slot [13] registered
Jan 23 01:04:37.921412 kernel: acpiphp: Slot [14] registered
Jan 23 01:04:37.921419 kernel: acpiphp: Slot [15] registered
Jan 23 01:04:37.921427 kernel: acpiphp: Slot [16] registered
Jan 23 01:04:37.921435 kernel: acpiphp: Slot [17] registered
Jan 23 01:04:37.921442 kernel: acpiphp: Slot [18] registered
Jan 23 01:04:37.921450 kernel: acpiphp: Slot [19] registered
Jan 23 01:04:37.921458 kernel: acpiphp: Slot [20] registered
Jan 23 01:04:37.921467 kernel: acpiphp: Slot [21] registered
Jan 23 01:04:37.921475 kernel: acpiphp: Slot [22] registered
Jan 23 01:04:37.921482 kernel: acpiphp: Slot [23] registered
Jan 23 01:04:37.921490 kernel: acpiphp: Slot [24] registered
Jan 23 01:04:37.921498 kernel: acpiphp: Slot [25] registered
Jan 23 01:04:37.921506 kernel: acpiphp: Slot [26] registered
Jan 23 01:04:37.921527 kernel: acpiphp: Slot [27] registered
Jan 23 01:04:37.921535 kernel: acpiphp: Slot [28] registered
Jan 23 01:04:37.921543 kernel: acpiphp: Slot [29] registered
Jan 23 01:04:37.921553 kernel: acpiphp: Slot [30] registered
Jan 23 01:04:37.921563 kernel: acpiphp: Slot [31] registered
Jan 23 01:04:37.921645 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 01:04:37.921720 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f]
Jan 23 01:04:37.921792 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 01:04:37.921802 kernel: acpiphp: Slot [0-2] registered
Jan 23 01:04:37.921882 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 01:04:37.921955 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff]
Jan 23 01:04:37.922043 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref]
Jan 23 01:04:37.922114 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 23 01:04:37.922185 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 01:04:37.922196 kernel: acpiphp: Slot [0-3] registered
Jan 23 01:04:37.922272 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 23 01:04:37.922346 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff]
Jan 23 01:04:37.922418 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref]
Jan 23 01:04:37.922490 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 01:04:37.922501 kernel: acpiphp: Slot [0-4] registered
Jan 23 01:04:37.924631 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 01:04:37.924716 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref]
Jan 23 01:04:37.924791 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 23 01:04:37.924803 kernel: acpiphp: Slot [0-5] registered
Jan 23 01:04:37.924883 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 01:04:37.924962 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff]
Jan 23 01:04:37.925035 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref]
Jan 23 01:04:37.925107 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 23 01:04:37.925118 kernel: acpiphp: Slot [0-6] registered
Jan 23 01:04:37.925187 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 23 01:04:37.925198 kernel: acpiphp: Slot [0-7] registered
Jan 23 01:04:37.925269 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 23 01:04:37.925280 kernel: acpiphp: Slot [0-8] registered
Jan 23 01:04:37.925352 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 23 01:04:37.925363 kernel: acpiphp: Slot [0-9] registered
Jan 23 01:04:37.925434 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 23 01:04:37.925445 kernel: acpiphp: Slot [0-10] registered
Jan 23 01:04:37.925527 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 23 01:04:37.925537 kernel: acpiphp: Slot [0-11] registered
Jan 23 01:04:37.925607 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 23 01:04:37.925618 kernel: acpiphp: Slot [0-12] registered
Jan 23 01:04:37.925690 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 23 01:04:37.925700 kernel: acpiphp: Slot [0-13] registered
Jan 23 01:04:37.925769 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 23 01:04:37.925780 kernel: acpiphp: Slot [0-14] registered
Jan 23 01:04:37.925849 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 23 01:04:37.925860 kernel: acpiphp: Slot [0-15] registered
Jan 23 01:04:37.925929 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 23 01:04:37.925940 kernel: acpiphp: Slot [0-16] registered
Jan 23 01:04:37.926012 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 23 01:04:37.926035 kernel: acpiphp: Slot [0-17] registered
Jan 23 01:04:37.926107 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 23 01:04:37.926118 kernel: acpiphp: Slot [0-18] registered
Jan 23 01:04:37.926186 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Jan 23 01:04:37.926196 kernel: acpiphp: Slot [0-19] registered
Jan 23 01:04:37.926265 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Jan 23 01:04:37.926278 kernel: acpiphp: Slot [0-20] registered
Jan 23 01:04:37.926348 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Jan 23 01:04:37.926359 kernel: acpiphp: Slot [0-21] registered
Jan 23 01:04:37.926429 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Jan 23 01:04:37.926439 kernel: acpiphp: Slot [0-22] registered
Jan 23 01:04:37.926507 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Jan 23 01:04:37.927898 kernel: acpiphp: Slot [0-23] registered
Jan 23 01:04:37.928001 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Jan 23 01:04:37.928016 kernel: acpiphp: Slot [0-24] registered
Jan 23 01:04:37.928088 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Jan 23 01:04:37.928099 kernel: acpiphp: Slot [0-25] registered
Jan 23 01:04:37.928168 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Jan 23 01:04:37.928178 kernel: acpiphp: Slot [0-26] registered
Jan 23 01:04:37.928248 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Jan 23 01:04:37.928259 kernel: acpiphp: Slot [0-27] registered
Jan 23 01:04:37.928328 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Jan 23 01:04:37.928342 kernel: acpiphp: Slot [0-28] registered
Jan 23 01:04:37.928411 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Jan 23 01:04:37.928421 kernel: acpiphp: Slot [0-29] registered
Jan 23 01:04:37.928489 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Jan 23 01:04:37.928499 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 01:04:37.928507 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 01:04:37.928536 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 01:04:37.928544 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 01:04:37.928552 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 23 01:04:37.928563 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 23 01:04:37.928570 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 23 01:04:37.928578 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 23 01:04:37.928587 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 23 01:04:37.928595 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 23 01:04:37.928603 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 23 01:04:37.928610 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 23 01:04:37.928618 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 23 01:04:37.928626 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 23 01:04:37.928636 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 23 01:04:37.928644 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 23 01:04:37.928652 kernel: iommu: Default domain type: Translated
Jan 23 01:04:37.928659 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 01:04:37.928667 kernel: efivars: Registered efivars operations
Jan 23 01:04:37.928675 kernel: PCI: Using ACPI for IRQ routing
Jan 23 01:04:37.928683 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 23 01:04:37.928691 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Jan 23 01:04:37.928698 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Jan 23 01:04:37.928708 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff]
Jan 23 01:04:37.928715 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff]
Jan 23 01:04:37.928723 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff]
Jan 23 01:04:37.928730 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff]
Jan 23 01:04:37.928738 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Jan 23 01:04:37.928746 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff]
Jan 23 01:04:37.928753 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff]
Jan 23 01:04:37.928831 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 23 01:04:37.928905 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 23 01:04:37.928974 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 01:04:37.928984 kernel: vgaarb: loaded
Jan 23 01:04:37.928993 kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 01:04:37.929000 kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 01:04:37.929008 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 01:04:37.929016 kernel: pnp: PnP ACPI init
Jan 23 01:04:37.929096 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Jan 23 01:04:37.929110 kernel: pnp: PnP ACPI: found 5 devices
Jan 23 01:04:37.929119 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 01:04:37.929127 kernel: NET: Registered PF_INET protocol family
Jan 23 01:04:37.929135 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 01:04:37.929143 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Jan 23 01:04:37.929151 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 01:04:37.929159 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 01:04:37.929167 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Jan 23 01:04:37.929174 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Jan 23 01:04:37.929185 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 23 01:04:37.929193 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 23 01:04:37.929201 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 01:04:37.929209 kernel: NET: Registered PF_XDP protocol family
Jan 23 01:04:37.929287 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Jan 23 01:04:37.929362 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 23 01:04:37.929434 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 23 01:04:37.929506 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 23 01:04:37.930465 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 23 01:04:37.930565 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 23 01:04:37.930641 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 23 01:04:37.930713 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 23 01:04:37.930785 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 23 01:04:37.930856 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 23 01:04:37.930929 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 23 01:04:37.931001 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 23 01:04:37.931075 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 23 01:04:37.931148 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 23 01:04:37.931219 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 23 01:04:37.931293 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 23 01:04:37.931366 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 23 01:04:37.931436 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Jan 23 01:04:37.931507 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Jan 23 01:04:37.931614 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Jan 23 01:04:37.931691 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 23 01:04:37.931761 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 23 01:04:37.931833 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 23 01:04:37.931904 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 23 01:04:37.931975 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 23 01:04:37.932045 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Jan 23 01:04:37.932120 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Jan 23 01:04:37.932189 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 23 01:04:37.932263 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 23 01:04:37.932335 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned
Jan 23 01:04:37.932405 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned
Jan 23 01:04:37.932475 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned
Jan 23 01:04:37.932864 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned
Jan 23 01:04:37.932940 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned
Jan 23 01:04:37.933011 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned
Jan 23 01:04:37.933081 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned
Jan 23 01:04:37.933154 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned
Jan 23 01:04:37.933222 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned
Jan 23 01:04:37.933291 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned
Jan 23 01:04:37.933360 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned
Jan 23 01:04:37.933430 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned
Jan 23 01:04:37.933499 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned
Jan 23 01:04:37.933581 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.933650 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.933722 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.933791 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.933860 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.934403 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.934482 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.934575 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.934646 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.934720 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.934792 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.934862 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.934933 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935001 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935070 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935140 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935211 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935282 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935351 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935419 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935489 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935571 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935640 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935709 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935778 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935850 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.935919 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.935988 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.936057 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.936126 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.936194 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned
Jan 23 01:04:37.936263 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned
Jan 23 01:04:37.936331 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned
Jan 23 01:04:37.936401 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned
Jan 23 01:04:37.936470 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned
Jan 23 01:04:37.936556 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned
Jan 23 01:04:37.936626 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned
Jan 23 01:04:37.936694 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned
Jan 23 01:04:37.936762 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned
Jan 23 01:04:37.936830 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned
Jan 23 01:04:37.936899 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned
Jan 23 01:04:37.936969 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned
Jan 23 01:04:37.937038 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned
Jan 23 01:04:37.937105 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937173 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937242 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937310 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937379 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937448 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937537 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937621 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937690 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937763 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937833 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.937901 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.937970 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.938053 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.938127 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.938195 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.938264 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.938332 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.938401 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.938469 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.938789 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.938862 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.938935 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.939003 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.939072 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.939140 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.939209 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.939277 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.939666 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Jan 23 01:04:37.939748 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Jan 23 01:04:37.939829 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 23 01:04:37.939902 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff]
Jan 23 01:04:37.940000 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff]
Jan 23 01:04:37.940073 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 23 01:04:37.940172 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 23 01:04:37.940243 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Jan 23 01:04:37.940312 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Jan 23 01:04:37.940381 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Jan 23 01:04:37.940456 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned
Jan 23 01:04:37.940545 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 23 01:04:37.940614 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Jan 23 01:04:37.940683 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Jan 23 01:04:37.940753 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 23 01:04:37.940822 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Jan 23 01:04:37.940891 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Jan 23
01:04:37.940960 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 23 01:04:37.941030 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 23 01:04:37.941099 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 23 01:04:37.941169 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 23 01:04:37.941240 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 23 01:04:37.941310 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 23 01:04:37.941384 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 23 01:04:37.941453 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 23 01:04:37.942612 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 23 01:04:37.942695 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 23 01:04:37.942770 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 23 01:04:37.942839 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 23 01:04:37.942909 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 23 01:04:37.942978 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 23 01:04:37.943048 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 23 01:04:37.943117 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 23 01:04:37.943185 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 23 01:04:37.943255 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 23 01:04:37.943323 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 23 01:04:37.943391 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 23 01:04:37.943463 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 23 01:04:37.943548 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 23 01:04:37.943617 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 23 01:04:37.943686 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 23 01:04:37.943755 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 23 01:04:37.943824 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 23 01:04:37.943893 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 23 01:04:37.943962 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 23 01:04:37.944031 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 23 01:04:37.944099 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 23 01:04:37.944171 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 23 01:04:37.944240 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 23 01:04:37.944309 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 23 01:04:37.944377 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 23 01:04:37.944446 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 23 01:04:37.945593 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 23 01:04:37.945692 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 23 01:04:37.945765 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 23 01:04:37.945836 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 23 
01:04:37.945913 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 23 01:04:37.945982 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 23 01:04:37.946063 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 23 01:04:37.946132 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 23 01:04:37.946204 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 23 01:04:37.946275 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 23 01:04:37.946345 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 23 01:04:37.946417 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 23 01:04:37.946488 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 23 01:04:37.947113 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 23 01:04:37.947191 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 23 01:04:37.947261 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 23 01:04:37.947332 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 23 01:04:37.947402 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 23 01:04:37.947478 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 23 01:04:37.947564 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 23 01:04:37.947635 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 23 01:04:37.947705 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 23 01:04:37.947776 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 23 01:04:37.947845 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 23 01:04:37.947917 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 23 01:04:37.947987 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 23 01:04:37.948059 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 23 01:04:37.948129 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 23 01:04:37.948199 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 23 01:04:37.948269 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 23 01:04:37.948338 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 23 01:04:37.948407 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 23 01:04:37.948477 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 23 01:04:37.948753 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 23 01:04:37.948828 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 23 01:04:37.948898 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 23 01:04:37.948968 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 23 01:04:37.949037 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 23 01:04:37.949106 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 23 01:04:37.949176 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 23 01:04:37.949251 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 23 01:04:37.949321 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 23 01:04:37.949389 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 23 01:04:37.949459 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 23 
01:04:37.949542 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 23 01:04:37.949611 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 23 01:04:37.949680 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 23 01:04:37.949749 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 23 01:04:37.949822 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 23 01:04:37.949891 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 23 01:04:37.949960 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 23 01:04:37.950039 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 23 01:04:37.950111 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 23 01:04:37.950180 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 23 01:04:37.950248 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 23 01:04:37.950317 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 23 01:04:37.950391 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 23 01:04:37.950458 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 23 01:04:37.950532 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 23 01:04:37.950595 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 23 01:04:37.950655 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 23 01:04:37.950716 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 23 01:04:37.950791 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 23 01:04:37.950859 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 23 01:04:37.950923 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 01:04:37.950993 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 23 01:04:37.951060 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 23 01:04:37.951127 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 23 01:04:37.951197 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 23 01:04:37.951265 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 23 01:04:37.951335 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 23 01:04:37.951399 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 23 01:04:37.951473 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 23 01:04:37.953076 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 23 01:04:37.953169 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 23 01:04:37.953235 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 23 01:04:37.953311 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 23 01:04:37.953376 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 23 01:04:37.953445 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 23 01:04:37.953539 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 23 01:04:37.953621 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 23 01:04:37.953687 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 23 01:04:37.953763 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Jan 23 01:04:37.953827 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 23 01:04:37.953896 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 23 01:04:37.953961 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 23 01:04:37.954061 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 23 01:04:37.954128 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 23 01:04:37.954200 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 23 01:04:37.954265 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 23 01:04:37.954336 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 23 01:04:37.954401 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 23 01:04:37.954469 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 23 01:04:37.955203 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 23 01:04:37.955283 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 23 01:04:37.955348 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 23 01:04:37.955419 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 23 01:04:37.955484 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 23 01:04:37.955571 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 23 01:04:37.955641 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 23 01:04:37.955705 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 23 01:04:37.955773 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 23 01:04:37.955837 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 23 01:04:37.955902 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 23 01:04:37.955971 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 23 01:04:37.956035 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 23 01:04:37.956103 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 23 01:04:37.956173 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 23 01:04:37.956238 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 23 01:04:37.956302 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 23 01:04:37.956370 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 23 01:04:37.956434 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Jan 23 01:04:37.956497 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 23 01:04:37.956586 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 23 01:04:37.956650 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 23 01:04:37.956714 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 23 01:04:37.956783 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 23 01:04:37.956847 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 23 01:04:37.956911 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 23 01:04:37.956978 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 23 01:04:37.957045 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Jan 23 01:04:37.957108 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 23 01:04:37.957176 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 23 01:04:37.957241 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 23 01:04:37.957304 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 23 01:04:37.957375 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 23 01:04:37.957442 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 23 01:04:37.957507 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 23 01:04:37.957602 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 23 01:04:37.957666 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 23 01:04:37.957730 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 23 01:04:37.957798 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 23 01:04:37.957863 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 23 01:04:37.957930 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 23 01:04:37.957998 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 23 01:04:37.958072 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 23 01:04:37.958136 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 23 01:04:37.958147 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 23 01:04:37.958155 kernel: PCI: CLS 0 bytes, default 64 Jan 23 01:04:37.958163 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 23 01:04:37.958171 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 23 01:04:37.958182 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 23 01:04:37.958191 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 23 01:04:37.958199 kernel: Initialise system trusted keyrings Jan 23 01:04:37.958207 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 23 01:04:37.958215 kernel: Key type asymmetric registered Jan 23 01:04:37.958223 kernel: Asymmetric key parser 'x509' registered Jan 23 01:04:37.958230 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 23 01:04:37.958239 kernel: io scheduler mq-deadline registered Jan 23 01:04:37.958249 kernel: io scheduler kyber registered Jan 23 01:04:37.958256 kernel: io scheduler bfq registered Jan 23 01:04:37.958330 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 23 01:04:37.958403 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 23 01:04:37.958475 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 23 01:04:37.958752 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 23 01:04:37.958829 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 23 01:04:37.958903 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 23 01:04:37.958974 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 23 01:04:37.959044 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 23 01:04:37.959114 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 23 01:04:37.959184 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 23 01:04:37.959253 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 23 01:04:37.959325 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Jan 23 01:04:37.959393 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 23 01:04:37.959463 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 23 01:04:37.959544 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 23 01:04:37.959614 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 23 01:04:37.959625 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 23 01:04:37.959696 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 23 01:04:37.959764 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 23 01:04:37.959832 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 23 01:04:37.959900 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 23 01:04:37.959969 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 23 01:04:37.960037 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 23 01:04:37.960109 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 23 01:04:37.960178 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 23 01:04:37.960247 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 23 01:04:37.960316 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 23 01:04:37.960384 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 23 01:04:37.960456 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 23 01:04:37.960539 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 23 01:04:37.960609 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 23 01:04:37.960677 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 23 01:04:37.960745 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 23 01:04:37.960755 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 23 01:04:37.960822 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 23 01:04:37.960891 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 23 01:04:37.960964 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 23 01:04:37.961032 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 23 01:04:37.961100 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 23 01:04:37.961170 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 23 01:04:37.961239 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 23 01:04:37.961308 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 23 01:04:37.961377 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 23 01:04:37.961446 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 23 01:04:37.961551 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 23 01:04:37.961623 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 23 01:04:37.961692 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 23 01:04:37.961761 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 23 01:04:37.961830 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 23 01:04:37.961900 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 23 01:04:37.961910 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 23 01:04:37.961977 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 23 01:04:37.962064 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 23 01:04:37.962134 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 23 01:04:37.962204 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 23 01:04:37.962273 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 23 01:04:37.962342 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 23 01:04:37.962410 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Jan 23 01:04:37.962479 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 23 01:04:37.962598 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 23 01:04:37.962669 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 23 01:04:37.962682 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 23 01:04:37.962691 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 01:04:37.962699 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 23 01:04:37.962707 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 23 01:04:37.962715 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 23 01:04:37.962723 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 23 01:04:37.962800 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 23 01:04:37.962811 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 23 01:04:37.962877 kernel: rtc_cmos 00:03: registered as rtc0 Jan 23 01:04:37.962941 kernel: rtc_cmos 00:03: setting system clock to 2026-01-23T01:04:37 UTC (1769130277) Jan 23 01:04:37.963005 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 23 01:04:37.963015 kernel: intel_pstate: CPU model not supported Jan 23 01:04:37.963023 kernel: efifb: probing for efifb Jan 23 01:04:37.963031 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 23 01:04:37.963039 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 23 01:04:37.963047 kernel: efifb: scrolling: redraw Jan 23 01:04:37.963058 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 23 01:04:37.963065 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 01:04:37.963073 kernel: fb0: EFI VGA frame buffer device Jan 23 01:04:37.963081 kernel: pstore: Using crash dump compression: deflate Jan 23 01:04:37.963089 kernel: pstore: Registered efi_pstore as persistent store backend Jan 23 01:04:37.963097 kernel: NET: Registered PF_INET6 protocol family Jan 23 01:04:37.963105 kernel: Segment Routing with IPv6 Jan 23 01:04:37.963113 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 01:04:37.963121 kernel: NET: Registered PF_PACKET protocol family Jan 23 01:04:37.963129 kernel: Key type dns_resolver registered Jan 23 01:04:37.963139 kernel: IPI shorthand broadcast: enabled Jan 23 01:04:37.963147 kernel: sched_clock: Marking stable (3911005600, 155352232)->(4178677673, -112319841) Jan 23 01:04:37.963155 kernel: registered taskstats version 1 Jan 23 01:04:37.963163 kernel: Loading compiled-in X.509 certificates Jan 23 01:04:37.963171 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: ed54f39d0282729985c39b8ffa9938cacff38d8a' Jan 23 01:04:37.963179 kernel: Demotion targets for Node 0: null Jan 23 01:04:37.963187 kernel: Key type .fscrypt registered Jan 23 01:04:37.963195 kernel: Key type fscrypt-provisioning registered Jan 23 01:04:37.963202 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 23 01:04:37.963212 kernel: ima: Allocated hash algorithm: sha1 Jan 23 01:04:37.963220 kernel: ima: No architecture policies found Jan 23 01:04:37.963228 kernel: clk: Disabling unused clocks Jan 23 01:04:37.963235 kernel: Warning: unable to open an initial console. 
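Several of the sizes the kernel prints above can be cross-checked with a few lines of arithmetic: the hash-table "order" values are log2 of the page count (assuming the usual 4 KiB x86-64 page size), the efifb "4000k" follows directly from the 1280x800x32 mode, and the RTC epoch value matches the printed UTC time. A quick self-contained sanity check:

```python
import datetime, math

PAGE = 4096  # assumed x86-64 page size

def order(nbytes):
    # kernel "order" = log2 of the number of pages in the allocation
    return int(math.log2(nbytes // PAGE))

# "TCP established hash table entries: 32768 (order: 6, 262144 bytes)"
assert order(262144) == 6
# "TCP bind hash table entries: 32768 (order: 8, 1048576 bytes)"
assert order(1048576) == 8

# "efifb: mode is 1280x800x32, linelength=5120": 32 bpp = 4 bytes/pixel
assert 1280 * 4 == 5120
assert 5120 * 800 // 1024 == 4000          # "using 4000k, total 4000k"

# "rtc_cmos 00:03: setting system clock to 2026-01-23T01:04:37 UTC (1769130277)"
t = datetime.datetime.fromtimestamp(1769130277, tz=datetime.timezone.utc)
assert t.isoformat() == "2026-01-23T01:04:37+00:00"
print("all sizes and timestamps are self-consistent")
```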
Jan 23 01:04:37.963244 kernel: Freeing unused kernel image (initmem) memory: 46196K Jan 23 01:04:37.963251 kernel: Write protecting the kernel read-only data: 40960k Jan 23 01:04:37.963259 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Jan 23 01:04:37.963267 kernel: Run /init as init process Jan 23 01:04:37.963275 kernel: with arguments: Jan 23 01:04:37.963287 kernel: /init Jan 23 01:04:37.963294 kernel: with environment: Jan 23 01:04:37.963302 kernel: HOME=/ Jan 23 01:04:37.963310 kernel: TERM=linux Jan 23 01:04:37.963319 systemd[1]: Successfully made /usr/ read-only. Jan 23 01:04:37.963331 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 01:04:37.963340 systemd[1]: Detected virtualization kvm. Jan 23 01:04:37.963350 systemd[1]: Detected architecture x86-64. Jan 23 01:04:37.963358 systemd[1]: Running in initrd. Jan 23 01:04:37.963367 systemd[1]: No hostname configured, using default hostname. Jan 23 01:04:37.963375 systemd[1]: Hostname set to . Jan 23 01:04:37.963383 systemd[1]: Initializing machine ID from VM UUID. Jan 23 01:04:37.963403 systemd[1]: Queued start job for default target initrd.target. Jan 23 01:04:37.963413 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 01:04:37.963422 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 01:04:37.963431 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 23 01:04:37.963440 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 01:04:37.963448 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 23 01:04:37.963459 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 23 01:04:37.963469 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 23 01:04:37.963477 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 23 01:04:37.963486 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 01:04:37.963494 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 01:04:37.963502 systemd[1]: Reached target paths.target - Path Units. Jan 23 01:04:37.963521 systemd[1]: Reached target slices.target - Slice Units. Jan 23 01:04:37.963532 systemd[1]: Reached target swap.target - Swaps. Jan 23 01:04:37.963540 systemd[1]: Reached target timers.target - Timer Units. Jan 23 01:04:37.963548 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 01:04:37.963557 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 01:04:37.963565 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 23 01:04:37.963574 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 23 01:04:37.963582 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
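The dev-disk-by\x2dlabel-*.device units that systemd reports as "Expecting device" above are ordinary udev paths run through systemd's unit-name escaping, where "/" becomes "-" and a literal "-" becomes "\x2d". A minimal sketch of the unescaping (the real tool is systemd-escape --unescape --path):

```python
import re

def unescape_unit(name: str) -> str:
    # drop the ".device" suffix, turn "-" separators back into "/",
    # then decode "\xNN" escapes (a literal "-" was escaped as "\x2d")
    stem = name.removesuffix(".device")
    path = "/" + stem.replace("-", "/")
    return re.sub(r"\\x([0-9a-fA-F]{2})",
                  lambda m: chr(int(m.group(1), 16)), path)

print(unescape_unit(r"dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device"))
# -> /dev/disk/by-label/EFI-SYSTEM
```

Decoding the separators before the \x2d escapes matters: doing it in the other order would make the restored literal hyphens indistinguishable from path separators.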
Jan 23 01:04:37.963590 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 01:04:37.963600 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 01:04:37.963609 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 01:04:37.963617 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 23 01:04:37.963625 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 01:04:37.963634 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 23 01:04:37.963642 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 23 01:04:37.963651 systemd[1]: Starting systemd-fsck-usr.service... Jan 23 01:04:37.963660 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 01:04:37.963668 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 01:04:37.963679 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:37.963687 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 23 01:04:37.963696 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 01:04:37.963704 systemd[1]: Finished systemd-fsck-usr.service. Jan 23 01:04:37.963715 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 01:04:37.963724 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:37.963732 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 23 01:04:37.963743 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 01:04:37.963774 systemd-journald[222]: Collecting audit messages is disabled. Jan 23 01:04:37.963798 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 01:04:37.963807 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 23 01:04:37.963816 kernel: Bridge firewalling registered Jan 23 01:04:37.963824 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 01:04:37.963833 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 01:04:37.963841 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 01:04:37.963852 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 01:04:37.963861 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 23 01:04:37.963869 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 01:04:37.963879 systemd-journald[222]: Journal started Jan 23 01:04:37.963903 systemd-journald[222]: Runtime Journal (/run/log/journal/bccfd3fbb5814349a64d2666a81a7fc0) is 8M, max 78M, 70M free. Jan 23 01:04:37.965547 systemd[1]: Started systemd-journald.service - Journal Service. 
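The bridge warning above ("filtering via arp/ip/ip6tables is no longer available by default") is answered a moment later when systemd-modules-load inserts br_netfilter. On systems that need bridge firewalling but do not get the module loaded for them, a modules-load.d drop-in is the usual fix; a sketch, assuming the standard /etc/modules-load.d path:

```python
from pathlib import Path

# one module name per line, read by systemd-modules-load at boot
conf = Path("/etc/modules-load.d/br_netfilter.conf")
conf.parent.mkdir(parents=True, exist_ok=True)
conf.write_text("br_netfilter\n")
```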
Jan 23 01:04:37.890760 systemd-modules-load[225]: Inserted module 'overlay' Jan 23 01:04:37.921270 systemd-modules-load[225]: Inserted module 'br_netfilter' Jan 23 01:04:37.967193 dracut-cmdline[250]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=e8d7116310bea9a494780b8becdce41e7cc03ed509d8e2363e08981a47b3edc6 Jan 23 01:04:37.973641 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 01:04:37.986566 systemd-tmpfiles[285]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 23 01:04:37.990654 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 01:04:37.994617 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 01:04:38.028864 systemd-resolved[316]: Positive Trust Anchors: Jan 23 01:04:38.029601 systemd-resolved[316]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 01:04:38.029635 systemd-resolved[316]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 01:04:38.034371 systemd-resolved[316]: Defaulting to hostname 'linux'. Jan 23 01:04:38.035250 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 01:04:38.036911 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 01:04:38.043545 kernel: SCSI subsystem initialized Jan 23 01:04:38.054530 kernel: Loading iSCSI transport class v2.0-870. Jan 23 01:04:38.064531 kernel: iscsi: registered transport (tcp) Jan 23 01:04:38.086699 kernel: iscsi: registered transport (qla4xxx) Jan 23 01:04:38.086762 kernel: QLogic iSCSI HBA Driver Jan 23 01:04:38.105657 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 01:04:38.123008 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 01:04:38.125202 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 01:04:38.173831 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 23 01:04:38.176383 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 23 01:04:38.235559 kernel: raid6: avx512x4 gen() 34708 MB/s Jan 23 01:04:38.252543 kernel: raid6: avx512x2 gen() 45936 MB/s Jan 23 01:04:38.269530 kernel: raid6: avx512x1 gen() 44374 MB/s Jan 23 01:04:38.286554 kernel: raid6: avx2x4 gen() 34322 MB/s Jan 23 01:04:38.303556 kernel: raid6: avx2x2 gen() 34300 MB/s Jan 23 01:04:38.320859 kernel: raid6: avx2x1 gen() 26742 MB/s Jan 23 01:04:38.320965 kernel: raid6: using algorithm avx512x2 gen() 45936 MB/s Jan 23 01:04:38.338953 kernel: raid6: .... 
xor() 26743 MB/s, rmw enabled Jan 23 01:04:38.339056 kernel: raid6: using avx512x2 recovery algorithm Jan 23 01:04:38.388603 kernel: xor: automatically using best checksumming function avx Jan 23 01:04:38.546617 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 23 01:04:38.561279 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 23 01:04:38.563659 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 01:04:38.622072 systemd-udevd[473]: Using default interface naming scheme 'v255'. Jan 23 01:04:38.630879 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 01:04:38.637791 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 23 01:04:38.667326 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation Jan 23 01:04:38.713864 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 01:04:38.717652 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 01:04:38.818834 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 01:04:38.826927 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 23 01:04:38.901529 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 23 01:04:38.913533 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 23 01:04:38.931564 kernel: cryptd: max_cpu_qlen set to 1000 Jan 23 01:04:38.942606 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 23 01:04:38.942641 kernel: GPT:17805311 != 104857599 Jan 23 01:04:38.942652 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 23 01:04:38.942662 kernel: GPT:17805311 != 104857599 Jan 23 01:04:38.942676 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 23 01:04:38.942686 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 01:04:38.947551 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 01:04:38.947702 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:38.950155 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:38.951918 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:38.959628 kernel: AES CTR mode by8 optimization enabled Jan 23 01:04:38.959654 kernel: ACPI: bus type USB registered Jan 23 01:04:38.959666 kernel: usbcore: registered new interface driver usbfs Jan 23 01:04:38.959678 kernel: usbcore: registered new interface driver hub Jan 23 01:04:38.956924 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jan 23 01:04:38.965460 kernel: usbcore: registered new device driver usb Jan 23 01:04:38.981999 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jan 23 01:04:38.984564 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 23 01:04:38.987567 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 23 01:04:38.987733 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 23 01:04:38.997521 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 23 01:04:38.997693 kernel: hub 1-0:1.0: USB hub found Jan 23 01:04:38.997821 kernel: hub 1-0:1.0: 2 ports detected Jan 23 01:04:39.009816 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:39.010523 kernel: libata version 3.00 loaded. 
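The virtio disk numbers and the GPT complaint above are mutually consistent: 104857600 512-byte sectors is exactly 50 GiB, and the backup GPT header at LBA 17805311 marks the end of the smaller disk the image was originally written for, not of the disk it was deployed onto (disk-uuid.service rewrites the headers shortly after this). The arithmetic:

```python
blocks, sector = 104857600, 512
size = blocks * sector
print(size / 1e9)      # 53.6870912 -> rounded to "53.7 GB"
print(size / 2**30)    # 50.0       -> "50.0 GiB"

alt_header_lba = 17805311          # backup GPT header written by the image
last_lba = blocks - 1              # 104857599, where GPT expects it
print(last_lba - alt_header_lba)   # sectors the image does not yet claim
```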
Jan 23 01:04:39.021069 kernel: ahci 0000:00:1f.2: version 3.0 Jan 23 01:04:39.021235 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 23 01:04:39.021249 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 23 01:04:39.022780 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 23 01:04:39.023745 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 23 01:04:39.028540 kernel: scsi host0: ahci Jan 23 01:04:39.031861 kernel: scsi host1: ahci Jan 23 01:04:39.034368 kernel: scsi host2: ahci Jan 23 01:04:39.034549 kernel: scsi host3: ahci Jan 23 01:04:39.034645 kernel: scsi host4: ahci Jan 23 01:04:39.036746 kernel: scsi host5: ahci Jan 23 01:04:39.036875 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 61 lpm-pol 1 Jan 23 01:04:39.038961 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 61 lpm-pol 1 Jan 23 01:04:39.038983 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 61 lpm-pol 1 Jan 23 01:04:39.041903 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 61 lpm-pol 1 Jan 23 01:04:39.041928 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 61 lpm-pol 1 Jan 23 01:04:39.044261 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 61 lpm-pol 1 Jan 23 01:04:39.076456 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 23 01:04:39.089815 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 23 01:04:39.101561 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jan 23 01:04:39.102833 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 23 01:04:39.110713 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 01:04:39.112320 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 23 01:04:39.142568 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 01:04:39.144643 disk-uuid[663]: Primary Header is updated. Jan 23 01:04:39.144643 disk-uuid[663]: Secondary Entries is updated. Jan 23 01:04:39.144643 disk-uuid[663]: Secondary Header is updated. Jan 23 01:04:39.224542 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 23 01:04:39.362883 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.362985 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.363010 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.363033 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.364852 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.367090 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 23 01:04:39.385621 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 23 01:04:39.387221 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 01:04:39.387821 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 01:04:39.388909 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 01:04:39.391153 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 23 01:04:39.421880 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
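The "6/6 ports implemented (port mask 0x3f)" line is a population count: each set bit in the AHCI port mask is one SATA port, matching the six "scsi hostN: ahci" entries (and the six "SATA link down" lines) above:

```python
mask = 0x3F
print(bin(mask), mask.bit_count())   # 0b111111 6 -> one bit per SATA port
```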
Jan 23 01:04:39.425543 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 23 01:04:39.435060 kernel: usbcore: registered new interface driver usbhid Jan 23 01:04:39.435097 kernel: usbhid: USB HID core driver Jan 23 01:04:39.441776 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 23 01:04:39.441828 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 23 01:04:40.166084 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 01:04:40.166205 disk-uuid[664]: The operation has completed successfully. Jan 23 01:04:40.275713 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 23 01:04:40.276678 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 23 01:04:40.311651 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 23 01:04:40.346486 sh[698]: Success Jan 23 01:04:40.372202 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 23 01:04:40.372298 kernel: device-mapper: uevent: version 1.0.3 Jan 23 01:04:40.374876 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 23 01:04:40.385556 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Jan 23 01:04:40.460100 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 23 01:04:40.464612 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 23 01:04:40.466100 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 23 01:04:40.493637 kernel: BTRFS: device fsid f8eb2396-46b8-49a3-a8e7-cd8ad10a3ce4 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (710) Jan 23 01:04:40.496909 kernel: BTRFS info (device dm-0): first mount of filesystem f8eb2396-46b8-49a3-a8e7-cd8ad10a3ce4 Jan 23 01:04:40.496950 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 23 01:04:40.517824 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 23 01:04:40.517899 kernel: BTRFS info (device dm-0): enabling free space tree Jan 23 01:04:40.520988 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 23 01:04:40.521894 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 23 01:04:40.522417 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 23 01:04:40.524624 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 23 01:04:40.525656 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 23 01:04:40.561534 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (741) Jan 23 01:04:40.564539 kernel: BTRFS info (device vda6): first mount of filesystem a3ccc207-e674-4ba2-b6d8-404b4581ba01 Jan 23 01:04:40.566534 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 01:04:40.574138 kernel: BTRFS info (device vda6): turning on async discard Jan 23 01:04:40.574168 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 01:04:40.579527 kernel: BTRFS info (device vda6): last unmount of filesystem a3ccc207-e674-4ba2-b6d8-404b4581ba01 Jan 23 01:04:40.580447 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
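verity-setup.service above ties /dev/mapper/usr to the verity.usrhash root hash seen on the kernel command line: dm-verity hashes each block of the read-only /usr image into a Merkle tree whose root must match before the mount proceeds. A toy sketch of the root-hash idea (block size and tree layout here are illustrative, not the actual dm-verity on-disk format):

```python
import hashlib

BLOCK = 4096

def root_hash(data: bytes) -> bytes:
    # hash every data block, then hash pairs upward until one root remains
    level = [hashlib.sha256(data[i:i + BLOCK]).digest()
             for i in range(0, len(data), BLOCK)] or [hashlib.sha256(b"").digest()]
    while len(level) > 1:
        level = [hashlib.sha256(b"".join(level[i:i + 2])).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

img = b"\x00" * (4 * BLOCK)
# flipping one byte anywhere changes the root, which is why a mismatch
# against verity.usrhash fails the /usr mount
assert root_hash(img) != root_hash(img[:-1] + b"\x01")
```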
Jan 23 01:04:40.582742 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 23 01:04:40.625544 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 01:04:40.627643 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 01:04:40.661889 systemd-networkd[880]: lo: Link UP Jan 23 01:04:40.661897 systemd-networkd[880]: lo: Gained carrier Jan 23 01:04:40.662886 systemd-networkd[880]: Enumeration completed Jan 23 01:04:40.662964 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 01:04:40.663762 systemd-networkd[880]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 23 01:04:40.663766 systemd-networkd[880]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 01:04:40.664410 systemd[1]: Reached target network.target - Network. Jan 23 01:04:40.665529 systemd-networkd[880]: eth0: Link UP Jan 23 01:04:40.665623 systemd-networkd[880]: eth0: Gained carrier Jan 23 01:04:40.665632 systemd-networkd[880]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 23 01:04:40.681820 systemd-networkd[880]: eth0: DHCPv4 address 10.0.5.114/25, gateway 10.0.5.1 acquired from 10.0.5.1 Jan 23 01:04:40.736203 ignition[818]: Ignition 2.22.0 Jan 23 01:04:40.736217 ignition[818]: Stage: fetch-offline Jan 23 01:04:40.736243 ignition[818]: no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:40.736251 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:40.739121 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 01:04:40.736319 ignition[818]: parsed url from cmdline: "" Jan 23 01:04:40.740603 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 23 01:04:40.736323 ignition[818]: no config URL provided Jan 23 01:04:40.736327 ignition[818]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 01:04:40.736333 ignition[818]: no config at "/usr/lib/ignition/user.ign" Jan 23 01:04:40.736338 ignition[818]: failed to fetch config: resource requires networking Jan 23 01:04:40.736611 ignition[818]: Ignition finished successfully Jan 23 01:04:40.773379 ignition[891]: Ignition 2.22.0 Jan 23 01:04:40.773391 ignition[891]: Stage: fetch Jan 23 01:04:40.773936 ignition[891]: no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:40.773947 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:40.774035 ignition[891]: parsed url from cmdline: "" Jan 23 01:04:40.774038 ignition[891]: no config URL provided Jan 23 01:04:40.774043 ignition[891]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 01:04:40.774051 ignition[891]: no config at "/usr/lib/ignition/user.ign" Jan 23 01:04:40.774155 ignition[891]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 23 01:04:40.774235 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 01:04:40.774258 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 01:04:41.774433 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 01:04:41.774528 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
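The DHCPv4 lease above is easy to sanity-check with the standard ipaddress module: a /25 prefix leaves 126 usable hosts, and the offered gateway 10.0.5.1 must lie in the same subnet as 10.0.5.114:

```python
import ipaddress

iface = ipaddress.ip_interface("10.0.5.114/25")
net = iface.network
print(net)                                    # 10.0.5.0/25
print(net.num_addresses - 2, "usable hosts")  # 126
assert ipaddress.ip_address("10.0.5.1") in net
```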
Jan 23 01:04:42.084770 systemd-networkd[880]: eth0: Gained IPv6LL Jan 23 01:04:42.539040 ignition[891]: GET result: OK Jan 23 01:04:42.539243 ignition[891]: parsing config with SHA512: 2eeb3f80076b34a0f427ec508aef5f5c6a27358ca2d9a35d55c1c65f9cae4fece8cf90e763ab0b1629d7ea13b4e6720c5b7b7da53ae24db2982d66fe8bcdb33e Jan 23 01:04:42.549221 unknown[891]: fetched base config from "system" Jan 23 01:04:42.549245 unknown[891]: fetched base config from "system" Jan 23 01:04:42.551797 ignition[891]: fetch: fetch complete Jan 23 01:04:42.549260 unknown[891]: fetched user config from "openstack" Jan 23 01:04:42.551811 ignition[891]: fetch: fetch passed Jan 23 01:04:42.551919 ignition[891]: Ignition finished successfully Jan 23 01:04:42.558749 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 23 01:04:42.562759 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 23 01:04:42.617009 ignition[897]: Ignition 2.22.0 Jan 23 01:04:42.617589 ignition[897]: Stage: kargs Jan 23 01:04:42.617812 ignition[897]: no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:42.617826 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:42.619291 ignition[897]: kargs: kargs passed Jan 23 01:04:42.619359 ignition[897]: Ignition finished successfully Jan 23 01:04:42.621892 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 23 01:04:42.624300 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 23 01:04:42.651766 ignition[904]: Ignition 2.22.0 Jan 23 01:04:42.651778 ignition[904]: Stage: disks Jan 23 01:04:42.651908 ignition[904]: no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:42.651917 ignition[904]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:42.653028 ignition[904]: disks: disks passed Jan 23 01:04:42.653070 ignition[904]: Ignition finished successfully Jan 23 01:04:42.654592 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 23 01:04:42.655732 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 23 01:04:42.656330 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 23 01:04:42.656970 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 01:04:42.657572 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 01:04:42.658172 systemd[1]: Reached target basic.target - Basic System. Jan 23 01:04:42.659636 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 23 01:04:42.706416 systemd-fsck[913]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jan 23 01:04:42.708842 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 23 01:04:42.710662 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 23 01:04:42.892571 kernel: EXT4-fs (vda9): mounted filesystem 2036722e-4586-420e-8dc7-a3b65e840c36 r/w with ordered data mode. Quota mode: none. Jan 23 01:04:42.894873 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 23 01:04:42.897070 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 23 01:04:42.902576 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 01:04:42.906714 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 23 01:04:42.909824 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
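Ignition logs the SHA512 digest of the user_data it fetched ("parsing config with SHA512: 2eeb...") before acting on it, which makes the exact applied config auditable after the fact. The same fingerprinting in Python (the config body below is a hypothetical placeholder, not the config from this boot):

```python
import hashlib

user_data = b'{"ignition": {"version": "3.4.0"}}'   # hypothetical config body
print("parsing config with SHA512:", hashlib.sha512(user_data).hexdigest())
```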
Jan 23 01:04:42.914803 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 23 01:04:42.916256 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 23 01:04:42.917497 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 01:04:42.931760 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 23 01:04:42.934822 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 23 01:04:42.948588 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (921) Jan 23 01:04:42.953970 kernel: BTRFS info (device vda6): first mount of filesystem a3ccc207-e674-4ba2-b6d8-404b4581ba01 Jan 23 01:04:42.954015 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 01:04:42.967908 kernel: BTRFS info (device vda6): turning on async discard Jan 23 01:04:42.967954 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 01:04:42.975184 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 01:04:43.017300 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:43.049413 initrd-setup-root[951]: cut: /sysroot/etc/passwd: No such file or directory Jan 23 01:04:43.060327 initrd-setup-root[958]: cut: /sysroot/etc/group: No such file or directory Jan 23 01:04:43.069087 initrd-setup-root[965]: cut: /sysroot/etc/shadow: No such file or directory Jan 23 01:04:43.075076 initrd-setup-root[972]: cut: /sysroot/etc/gshadow: No such file or directory Jan 23 01:04:43.222963 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 23 01:04:43.227153 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 23 01:04:43.229592 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 23 01:04:43.245982 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 23 01:04:43.248099 kernel: BTRFS info (device vda6): last unmount of filesystem a3ccc207-e674-4ba2-b6d8-404b4581ba01 Jan 23 01:04:43.278885 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 23 01:04:43.285537 ignition[1039]: INFO : Ignition 2.22.0 Jan 23 01:04:43.285537 ignition[1039]: INFO : Stage: mount Jan 23 01:04:43.285537 ignition[1039]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:43.285537 ignition[1039]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:43.288727 ignition[1039]: INFO : mount: mount passed Jan 23 01:04:43.289428 ignition[1039]: INFO : Ignition finished successfully Jan 23 01:04:43.290744 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 23 01:04:44.066625 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:46.080567 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:50.095147 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:50.105451 coreos-metadata[923]: Jan 23 01:04:50.105 WARN failed to locate config-drive, using the metadata service API instead Jan 23 01:04:50.125458 coreos-metadata[923]: Jan 23 01:04:50.125 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 01:04:50.744237 coreos-metadata[923]: Jan 23 01:04:50.744 INFO Fetch successful Jan 23 01:04:50.746015 coreos-metadata[923]: Jan 23 01:04:50.745 INFO wrote hostname ci-4459-2-2-n-615049e46b to /sysroot/etc/hostname Jan 23 01:04:50.746908 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 23 01:04:50.747029 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 23 01:04:50.750609 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 23 01:04:50.773895 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 01:04:50.818570 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1056) Jan 23 01:04:50.825545 kernel: BTRFS info (device vda6): first mount of filesystem a3ccc207-e674-4ba2-b6d8-404b4581ba01 Jan 23 01:04:50.825652 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 01:04:50.837353 kernel: BTRFS info (device vda6): turning on async discard Jan 23 01:04:50.837428 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 01:04:50.842377 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 01:04:50.910261 ignition[1074]: INFO : Ignition 2.22.0 Jan 23 01:04:50.911504 ignition[1074]: INFO : Stage: files Jan 23 01:04:50.912615 ignition[1074]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:50.914582 ignition[1074]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:50.914582 ignition[1074]: DEBUG : files: compiled without relabeling support, skipping Jan 23 01:04:50.917486 ignition[1074]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 23 01:04:50.918307 ignition[1074]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 23 01:04:50.924075 ignition[1074]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 23 01:04:50.924894 ignition[1074]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 23 01:04:50.926041 unknown[1074]: wrote ssh authorized keys file for user: core Jan 23 01:04:50.926952 ignition[1074]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 23 01:04:50.933444 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 23 01:04:50.933444 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 23 01:04:50.990775 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 23 01:04:51.106957 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file 
"/sysroot/home/core/install.sh" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 23 01:04:51.108072 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 23 01:04:51.111196 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 23 01:04:51.111196 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 23 01:04:51.111196 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 23 01:04:51.112678 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 23 01:04:51.112678 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 23 01:04:51.112678 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Jan 23 01:04:51.364897 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 23 01:04:51.935591 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 23 01:04:51.937504 ignition[1074]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 23 01:04:51.937504 ignition[1074]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 01:04:51.941053 ignition[1074]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 01:04:51.941053 ignition[1074]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 23 01:04:51.941053 ignition[1074]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 23 01:04:51.944993 ignition[1074]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 23 01:04:51.944993 ignition[1074]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 23 01:04:51.944993 ignition[1074]: INFO : files: 
createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 23 01:04:51.944993 ignition[1074]: INFO : files: files passed Jan 23 01:04:51.944993 ignition[1074]: INFO : Ignition finished successfully Jan 23 01:04:51.946581 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 23 01:04:51.951089 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 23 01:04:51.953637 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 23 01:04:51.970827 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 23 01:04:51.970970 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 23 01:04:51.980059 initrd-setup-root-after-ignition[1104]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 01:04:51.980059 initrd-setup-root-after-ignition[1104]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 23 01:04:51.983232 initrd-setup-root-after-ignition[1108]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 01:04:51.985699 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 01:04:51.986780 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 23 01:04:51.988844 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 23 01:04:52.041960 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 23 01:04:52.042155 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 23 01:04:52.043708 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 23 01:04:52.044666 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 23 01:04:52.046024 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 23 01:04:52.047211 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 23 01:04:52.067625 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 01:04:52.071050 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 23 01:04:52.100129 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 23 01:04:52.101818 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 01:04:52.103336 systemd[1]: Stopped target timers.target - Timer Units. Jan 23 01:04:52.104742 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 23 01:04:52.104937 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 01:04:52.106589 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 23 01:04:52.107531 systemd[1]: Stopped target basic.target - Basic System. Jan 23 01:04:52.108659 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 23 01:04:52.109834 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 01:04:52.110976 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 23 01:04:52.112148 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 23 01:04:52.113434 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 23 01:04:52.114830 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. 
Jan 23 01:04:52.116127 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 23 01:04:52.117418 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 23 01:04:52.118563 systemd[1]: Stopped target swap.target - Swaps. Jan 23 01:04:52.119727 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 23 01:04:52.119939 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 23 01:04:52.121559 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 23 01:04:52.122960 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 01:04:52.124090 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 23 01:04:52.124263 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 01:04:52.125419 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 23 01:04:52.125598 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 23 01:04:52.127324 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 23 01:04:52.127496 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 01:04:52.128722 systemd[1]: ignition-files.service: Deactivated successfully. Jan 23 01:04:52.128875 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 23 01:04:52.131259 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 23 01:04:52.135775 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 23 01:04:52.136502 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 23 01:04:52.136687 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 01:04:52.141415 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 23 01:04:52.142131 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 01:04:52.146275 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 23 01:04:52.147648 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 23 01:04:52.171149 ignition[1128]: INFO : Ignition 2.22.0 Jan 23 01:04:52.171149 ignition[1128]: INFO : Stage: umount Jan 23 01:04:52.174917 ignition[1128]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 01:04:52.174917 ignition[1128]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 01:04:52.174917 ignition[1128]: INFO : umount: umount passed Jan 23 01:04:52.174917 ignition[1128]: INFO : Ignition finished successfully Jan 23 01:04:52.173858 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 23 01:04:52.175860 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 23 01:04:52.175969 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 23 01:04:52.176764 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 23 01:04:52.176804 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 23 01:04:52.177859 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 23 01:04:52.177908 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 23 01:04:52.178369 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 23 01:04:52.178403 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 23 01:04:52.180062 systemd[1]: Stopped target network.target - Network. Jan 23 01:04:52.181606 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. 
Jan 23 01:04:52.181671 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 01:04:52.182260 systemd[1]: Stopped target paths.target - Path Units. Jan 23 01:04:52.182781 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 23 01:04:52.186576 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 01:04:52.187260 systemd[1]: Stopped target slices.target - Slice Units. Jan 23 01:04:52.188050 systemd[1]: Stopped target sockets.target - Socket Units. Jan 23 01:04:52.188928 systemd[1]: iscsid.socket: Deactivated successfully. Jan 23 01:04:52.188972 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 01:04:52.189717 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 23 01:04:52.189756 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 01:04:52.190488 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 23 01:04:52.190564 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 23 01:04:52.191290 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 23 01:04:52.191335 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 23 01:04:52.192199 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 23 01:04:52.192969 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 23 01:04:52.194918 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 23 01:04:52.195026 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 23 01:04:52.196339 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 23 01:04:52.196459 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 23 01:04:52.201238 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 23 01:04:52.201371 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 23 01:04:52.205235 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jan 23 01:04:52.205483 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 23 01:04:52.205663 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 23 01:04:52.207677 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jan 23 01:04:52.208322 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 23 01:04:52.208950 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 23 01:04:52.209000 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 23 01:04:52.210872 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 23 01:04:52.211372 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 23 01:04:52.211428 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 01:04:52.212006 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 23 01:04:52.212049 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 23 01:04:52.213630 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 23 01:04:52.213673 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 23 01:04:52.214225 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 23 01:04:52.214265 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 23 01:04:52.216631 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 01:04:52.218194 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jan 23 01:04:52.218252 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jan 23 01:04:52.225761 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 23 01:04:52.226373 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 01:04:52.227812 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 23 01:04:52.228289 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 23 01:04:52.228796 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 23 01:04:52.228824 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 01:04:52.229202 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 23 01:04:52.229239 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 23 01:04:52.229717 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 23 01:04:52.229750 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 23 01:04:52.230176 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 23 01:04:52.230212 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 01:04:52.233212 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 23 01:04:52.234223 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 23 01:04:52.234267 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 01:04:52.237149 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 23 01:04:52.237187 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 01:04:52.238639 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 23 01:04:52.238675 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 01:04:52.239204 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 23 01:04:52.239236 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 01:04:52.239976 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 01:04:52.240010 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:52.242732 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Jan 23 01:04:52.242777 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Jan 23 01:04:52.242806 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Jan 23 01:04:52.242837 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jan 23 01:04:52.243133 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 23 01:04:52.244821 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 23 01:04:52.245811 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 23 01:04:52.246292 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
Jan 23 01:04:52.248077 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 23 01:04:52.249191 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 23 01:04:52.269893 systemd[1]: Switching root. Jan 23 01:04:52.320096 systemd-journald[222]: Journal stopped Jan 23 01:04:54.387304 systemd-journald[222]: Received SIGTERM from PID 1 (systemd). Jan 23 01:04:54.387396 kernel: SELinux: policy capability network_peer_controls=1 Jan 23 01:04:54.387411 kernel: SELinux: policy capability open_perms=1 Jan 23 01:04:54.387421 kernel: SELinux: policy capability extended_socket_class=1 Jan 23 01:04:54.387434 kernel: SELinux: policy capability always_check_network=0 Jan 23 01:04:54.387447 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 23 01:04:54.387461 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 23 01:04:54.387471 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 23 01:04:54.387484 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 23 01:04:54.387496 kernel: SELinux: policy capability userspace_initial_context=0 Jan 23 01:04:54.387506 kernel: audit: type=1403 audit(1769130293.254:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 23 01:04:54.389534 systemd[1]: Successfully loaded SELinux policy in 105.799ms. Jan 23 01:04:54.389561 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.566ms. Jan 23 01:04:54.389573 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 01:04:54.389585 systemd[1]: Detected virtualization kvm. Jan 23 01:04:54.389595 systemd[1]: Detected architecture x86-64. Jan 23 01:04:54.389608 systemd[1]: Detected first boot. Jan 23 01:04:54.389618 systemd[1]: Hostname set to <ci-4459-2-2-n-615049e46b>. Jan 23 01:04:54.389628 systemd[1]: Initializing machine ID from VM UUID. Jan 23 01:04:54.389639 zram_generator::config[1171]: No configuration found. Jan 23 01:04:54.389650 kernel: Guest personality initialized and is inactive Jan 23 01:04:54.389660 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 23 01:04:54.389669 kernel: Initialized host personality Jan 23 01:04:54.389679 kernel: NET: Registered PF_VSOCK protocol family Jan 23 01:04:54.389691 systemd[1]: Populated /etc with preset unit settings. Jan 23 01:04:54.389702 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jan 23 01:04:54.389712 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 23 01:04:54.389722 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 23 01:04:54.389733 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 23 01:04:54.389747 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 23 01:04:54.389757 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 23 01:04:54.389767 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 23 01:04:54.389777 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 23 01:04:54.389789 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 23 01:04:54.389799 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 23 01:04:54.389809 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 23 01:04:54.389820 systemd[1]: Created slice user.slice - User and Session Slice. Jan 23 01:04:54.389831 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 01:04:54.389841 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 01:04:54.389852 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 23 01:04:54.389863 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 23 01:04:54.389874 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 23 01:04:54.389884 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 01:04:54.389894 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 23 01:04:54.389904 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 01:04:54.389916 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 01:04:54.389926 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 23 01:04:54.389936 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 23 01:04:54.389948 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 23 01:04:54.389969 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 23 01:04:54.389980 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 01:04:54.389996 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 01:04:54.390007 systemd[1]: Reached target slices.target - Slice Units. Jan 23 01:04:54.390017 systemd[1]: Reached target swap.target - Swaps. Jan 23 01:04:54.390027 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 23 01:04:54.390037 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 23 01:04:54.390047 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 23 01:04:54.390061 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 01:04:54.390071 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 01:04:54.390082 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 01:04:54.390093 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 23 01:04:54.390106 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 23 01:04:54.390116 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 23 01:04:54.390126 systemd[1]: Mounting media.mount - External Media Directory... Jan 23 01:04:54.390137 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:54.390148 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 23 01:04:54.390160 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 23 01:04:54.390170 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Jan 23 01:04:54.390181 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 23 01:04:54.390191 systemd[1]: Reached target machines.target - Containers. Jan 23 01:04:54.390201 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 23 01:04:54.390211 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 01:04:54.390222 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 01:04:54.390232 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 23 01:04:54.390243 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 01:04:54.390253 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 01:04:54.390264 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 01:04:54.390275 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 23 01:04:54.390286 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 01:04:54.390296 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 23 01:04:54.390306 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 23 01:04:54.390319 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 23 01:04:54.390329 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 23 01:04:54.390340 systemd[1]: Stopped systemd-fsck-usr.service. Jan 23 01:04:54.390354 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 01:04:54.390364 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 01:04:54.390376 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 01:04:54.390387 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 01:04:54.390396 kernel: fuse: init (API version 7.41) Jan 23 01:04:54.390406 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 23 01:04:54.390416 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 23 01:04:54.390426 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 01:04:54.390436 kernel: loop: module loaded Jan 23 01:04:54.390448 systemd[1]: verity-setup.service: Deactivated successfully. Jan 23 01:04:54.390458 systemd[1]: Stopped verity-setup.service. Jan 23 01:04:54.390469 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:54.390479 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 23 01:04:54.390490 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 23 01:04:54.390500 systemd[1]: Mounted media.mount - External Media Directory. Jan 23 01:04:54.390519 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jan 23 01:04:54.390529 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 23 01:04:54.390539 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 23 01:04:54.390551 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 01:04:54.390562 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 23 01:04:54.390572 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 23 01:04:54.390582 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 01:04:54.390614 systemd-journald[1245]: Collecting audit messages is disabled. Jan 23 01:04:54.390638 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 01:04:54.390650 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 01:04:54.390661 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 01:04:54.390673 systemd-journald[1245]: Journal started Jan 23 01:04:54.390695 systemd-journald[1245]: Runtime Journal (/run/log/journal/bccfd3fbb5814349a64d2666a81a7fc0) is 8M, max 78M, 70M free. Jan 23 01:04:54.109934 systemd[1]: Queued start job for default target multi-user.target. Jan 23 01:04:54.134347 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 23 01:04:54.134805 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 23 01:04:54.393545 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 01:04:54.394234 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 23 01:04:54.395566 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 23 01:04:54.396188 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 01:04:54.396317 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 01:04:54.397926 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 23 01:04:54.403161 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 01:04:54.410179 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 01:04:54.412907 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 23 01:04:54.416659 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 23 01:04:54.421148 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 23 01:04:54.421184 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 01:04:54.422497 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 23 01:04:54.430296 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 23 01:04:54.431954 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 01:04:54.433726 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 23 01:04:54.439681 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 23 01:04:54.440227 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 01:04:54.446712 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Jan 23 01:04:54.447275 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 01:04:54.448810 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 23 01:04:54.450649 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 01:04:54.457744 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 01:04:54.458616 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 23 01:04:54.459271 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 23 01:04:54.459783 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 23 01:04:54.476545 kernel: ACPI: bus type drm_connector registered Jan 23 01:04:54.476650 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 01:04:54.482691 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 23 01:04:54.483976 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 01:04:54.484289 systemd-journald[1245]: Time spent on flushing to /var/log/journal/bccfd3fbb5814349a64d2666a81a7fc0 is 75.539ms for 1712 entries. Jan 23 01:04:54.484289 systemd-journald[1245]: System Journal (/var/log/journal/bccfd3fbb5814349a64d2666a81a7fc0) is 8M, max 584.8M, 576.8M free. Jan 23 01:04:54.586840 systemd-journald[1245]: Received client request to flush runtime journal. Jan 23 01:04:54.586895 kernel: loop0: detected capacity change from 0 to 128560 Jan 23 01:04:54.484548 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 01:04:54.495316 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 23 01:04:54.496742 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 23 01:04:54.498755 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 23 01:04:54.536084 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 01:04:54.538930 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Jan 23 01:04:54.538941 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Jan 23 01:04:54.543814 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 01:04:54.548758 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 23 01:04:54.564747 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 01:04:54.591066 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 23 01:04:54.604167 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 23 01:04:54.609618 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 01:04:54.623025 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 23 01:04:54.627622 kernel: loop1: detected capacity change from 0 to 1640 Jan 23 01:04:54.626620 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 01:04:54.650055 systemd-tmpfiles[1317]: ACLs are not supported, ignoring. Jan 23 01:04:54.650069 systemd-tmpfiles[1317]: ACLs are not supported, ignoring. 
Jan 23 01:04:54.652536 kernel: loop2: detected capacity change from 0 to 219144 Jan 23 01:04:54.657551 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 01:04:54.789592 kernel: loop3: detected capacity change from 0 to 110984 Jan 23 01:04:54.919538 kernel: loop4: detected capacity change from 0 to 128560 Jan 23 01:04:54.997546 kernel: loop5: detected capacity change from 0 to 1640 Jan 23 01:04:55.011931 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 23 01:04:55.013891 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 01:04:55.033783 kernel: loop6: detected capacity change from 0 to 219144 Jan 23 01:04:55.056338 systemd-udevd[1327]: Using default interface naming scheme 'v255'. Jan 23 01:04:55.137495 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 23 01:04:55.182578 kernel: loop7: detected capacity change from 0 to 110984 Jan 23 01:04:55.220227 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 01:04:55.227121 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 01:04:55.253184 (sd-merge)[1325]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Jan 23 01:04:55.271644 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 23 01:04:55.293538 (sd-merge)[1325]: Merged extensions into '/usr'. Jan 23 01:04:55.312190 systemd[1]: Reload requested from client PID 1291 ('systemd-sysext') (unit systemd-sysext.service)... Jan 23 01:04:55.312330 systemd[1]: Reloading... Jan 23 01:04:55.393531 zram_generator::config[1383]: No configuration found. Jan 23 01:04:55.533551 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 23 01:04:55.546421 systemd-networkd[1330]: lo: Link UP Jan 23 01:04:55.548004 systemd-networkd[1330]: lo: Gained carrier Jan 23 01:04:55.549160 systemd-networkd[1330]: Enumeration completed Jan 23 01:04:55.550864 systemd-networkd[1330]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 23 01:04:55.550941 systemd-networkd[1330]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 01:04:55.551665 systemd-networkd[1330]: eth0: Link UP Jan 23 01:04:55.552003 systemd-networkd[1330]: eth0: Gained carrier Jan 23 01:04:55.552237 systemd-networkd[1330]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 23 01:04:55.554525 kernel: ACPI: button: Power Button [PWRF] Jan 23 01:04:55.563529 kernel: mousedev: PS/2 mouse device common for all mice Jan 23 01:04:55.566593 systemd-networkd[1330]: eth0: DHCPv4 address 10.0.5.114/25, gateway 10.0.5.1 acquired from 10.0.5.1 Jan 23 01:04:55.677471 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 23 01:04:55.677741 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 23 01:04:55.681626 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 23 01:04:55.684912 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 23 01:04:55.685320 systemd[1]: Reloading finished in 372 ms. Jan 23 01:04:55.701158 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 23 01:04:55.702368 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 23 01:04:55.704114 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 23 01:04:55.738003 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 01:04:55.746619 systemd[1]: Starting ensure-sysext.service... Jan 23 01:04:55.751624 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 23 01:04:55.754654 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 23 01:04:55.760667 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 23 01:04:55.764166 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 01:04:55.788487 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:55.815852 systemd[1]: Reload requested from client PID 1455 ('systemctl') (unit ensure-sysext.service)... Jan 23 01:04:55.815868 systemd[1]: Reloading... Jan 23 01:04:55.876211 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 23 01:04:55.877996 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 23 01:04:55.878243 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 23 01:04:55.878448 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 23 01:04:55.881058 systemd-tmpfiles[1459]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 23 01:04:55.881260 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Jan 23 01:04:55.881306 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Jan 23 01:04:55.897069 systemd-tmpfiles[1459]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 01:04:55.897081 systemd-tmpfiles[1459]: Skipping /boot Jan 23 01:04:55.927839 systemd-tmpfiles[1459]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 01:04:55.927851 systemd-tmpfiles[1459]: Skipping /boot Jan 23 01:04:55.942544 zram_generator::config[1497]: No configuration found. Jan 23 01:04:56.034605 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 23 01:04:56.037048 kernel: Console: switching to colour dummy device 80x25 Jan 23 01:04:56.038576 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 23 01:04:56.039675 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 23 01:04:56.039696 kernel: [drm] features: -context_init Jan 23 01:04:56.039711 ldconfig[1282]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 23 01:04:56.042528 kernel: [drm] number of scanouts: 1 Jan 23 01:04:56.042566 kernel: [drm] number of cap sets: 0 Jan 23 01:04:56.042582 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 23 01:04:56.046532 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 23 01:04:56.049291 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 01:04:56.054529 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 23 01:04:56.188294 systemd[1]: Reloading finished in 372 ms. Jan 23 01:04:56.201994 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. 
Jan 23 01:04:56.208564 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 23 01:04:56.213732 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 23 01:04:56.214075 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 01:04:56.214404 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:56.234726 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:56.238699 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 01:04:56.241920 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 23 01:04:56.242114 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 01:04:56.243062 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 01:04:56.246421 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 01:04:56.252427 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 01:04:56.254286 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 01:04:56.254395 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 01:04:56.256671 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 23 01:04:56.258689 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 01:04:56.261374 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 23 01:04:56.261915 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 01:04:56.262039 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:56.262322 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:56.263636 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:56.264893 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:56.268040 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jan 23 01:04:56.270042 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 01:04:56.272681 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 01:04:56.276421 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 01:04:56.286194 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 01:04:56.286369 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 01:04:56.290618 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 01:04:56.290808 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 01:04:56.297683 systemd[1]: Finished ensure-sysext.service. 
Jan 23 01:04:56.305693 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:56.305851 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 01:04:56.311804 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 01:04:56.314643 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 01:04:56.318600 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 23 01:04:56.320281 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 01:04:56.320319 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 01:04:56.320378 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 01:04:56.320410 systemd[1]: Reached target time-set.target - System Time Set. Jan 23 01:04:56.320916 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 01:04:56.321179 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 01:04:56.322570 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 01:04:56.325945 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jan 23 01:04:56.331588 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 23 01:04:56.334999 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 01:04:56.340194 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 01:04:56.342462 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 01:04:56.342614 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 01:04:56.348871 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 23 01:04:56.348919 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 23 01:04:56.348973 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 01:04:56.351647 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 23 01:04:56.354661 kernel: PTP clock support registered Jan 23 01:04:56.355762 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 01:04:56.357225 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 23 01:04:56.361912 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 23 01:04:56.365669 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 23 01:04:56.379576 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 23 01:04:56.384326 augenrules[1599]: No rules Jan 23 01:04:56.385792 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 01:04:56.387562 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 01:04:56.405409 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 23 01:04:56.419559 systemd-resolved[1554]: Positive Trust Anchors: Jan 23 01:04:56.419571 systemd-resolved[1554]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 01:04:56.419602 systemd-resolved[1554]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 01:04:56.421745 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 23 01:04:56.422532 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 23 01:04:56.424064 systemd-resolved[1554]: Using system hostname 'ci-4459-2-2-n-615049e46b'. Jan 23 01:04:56.425348 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 01:04:56.426476 systemd[1]: Reached target network.target - Network. Jan 23 01:04:56.426911 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 01:04:56.427320 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 01:04:56.428649 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 23 01:04:56.429113 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 23 01:04:56.429557 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 23 01:04:56.430066 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 23 01:04:56.430487 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 23 01:04:56.432333 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 23 01:04:56.432746 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 23 01:04:56.432774 systemd[1]: Reached target paths.target - Path Units. Jan 23 01:04:56.433122 systemd[1]: Reached target timers.target - Timer Units. Jan 23 01:04:56.435243 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 23 01:04:56.437714 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 23 01:04:56.440648 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 23 01:04:56.442505 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 23 01:04:56.443755 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 23 01:04:56.445876 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 23 01:04:56.446626 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 23 01:04:56.447699 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 23 01:04:56.449822 systemd[1]: Reached target sockets.target - Socket Units. 
Jan 23 01:04:56.450443 systemd[1]: Reached target basic.target - Basic System. Jan 23 01:04:56.451077 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 23 01:04:56.451158 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 23 01:04:56.453861 systemd[1]: Starting chronyd.service - NTP client/server... Jan 23 01:04:56.459600 systemd[1]: Starting containerd.service - containerd container runtime... Jan 23 01:04:56.464479 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 23 01:04:56.469620 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 23 01:04:56.471028 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 23 01:04:56.474352 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 23 01:04:56.481623 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 23 01:04:56.482146 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 23 01:04:56.488704 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 23 01:04:56.496631 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 23 01:04:56.501373 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 23 01:04:56.507967 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:56.508029 jq[1617]: false Jan 23 01:04:56.510460 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 23 01:04:56.515618 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 23 01:04:56.519297 chronyd[1612]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 23 01:04:56.521537 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Refreshing passwd entry cache Jan 23 01:04:56.520829 oslogin_cache_refresh[1619]: Refreshing passwd entry cache Jan 23 01:04:56.521985 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 23 01:04:56.525149 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 23 01:04:56.525251 chronyd[1612]: Loaded seccomp filter (level 2) Jan 23 01:04:56.528718 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 23 01:04:56.530784 systemd[1]: Starting update-engine.service - Update Engine... Jan 23 01:04:56.537581 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 23 01:04:56.538751 systemd[1]: Started chronyd.service - NTP client/server. Jan 23 01:04:56.540613 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 23 01:04:56.544539 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Failure getting users, quitting Jan 23 01:04:56.544539 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Jan 23 01:04:56.544539 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Refreshing group entry cache Jan 23 01:04:56.544097 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 23 01:04:56.543143 oslogin_cache_refresh[1619]: Failure getting users, quitting Jan 23 01:04:56.543160 oslogin_cache_refresh[1619]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 23 01:04:56.543203 oslogin_cache_refresh[1619]: Refreshing group entry cache Jan 23 01:04:56.546205 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 23 01:04:56.548125 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 23 01:04:56.548292 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 23 01:04:56.550507 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Failure getting groups, quitting Jan 23 01:04:56.550507 google_oslogin_nss_cache[1619]: oslogin_cache_refresh[1619]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 01:04:56.550330 oslogin_cache_refresh[1619]: Failure getting groups, quitting Jan 23 01:04:56.550341 oslogin_cache_refresh[1619]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 01:04:56.551937 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 23 01:04:56.552407 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 23 01:04:56.562566 extend-filesystems[1618]: Found /dev/vda6 Jan 23 01:04:56.574920 update_engine[1630]: I20260123 01:04:56.572217 1630 main.cc:92] Flatcar Update Engine starting Jan 23 01:04:56.584541 tar[1635]: linux-amd64/LICENSE Jan 23 01:04:56.584969 jq[1631]: true Jan 23 01:04:56.586383 extend-filesystems[1618]: Found /dev/vda9 Jan 23 01:04:56.586856 (ntainerd)[1649]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 23 01:04:56.594612 tar[1635]: linux-amd64/helm Jan 23 01:04:56.597358 systemd[1]: motdgen.service: Deactivated successfully. Jan 23 01:04:56.597578 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 23 01:04:56.600687 extend-filesystems[1618]: Checking size of /dev/vda9 Jan 23 01:04:56.625558 extend-filesystems[1618]: Resized partition /dev/vda9 Jan 23 01:04:56.633211 extend-filesystems[1661]: resize2fs 1.47.3 (8-Jul-2025) Jan 23 01:04:56.633887 jq[1655]: true Jan 23 01:04:56.638227 dbus-daemon[1615]: [system] SELinux support is enabled Jan 23 01:04:56.639694 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 23 01:04:56.647801 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Jan 23 01:04:56.647926 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 23 01:04:56.647951 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 23 01:04:56.650145 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 23 01:04:56.650165 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 23 01:04:56.654560 systemd[1]: Started update-engine.service - Update Engine. 
Jan 23 01:04:56.656843 update_engine[1630]: I20260123 01:04:56.656800 1630 update_check_scheduler.cc:74] Next update check in 4m55s Jan 23 01:04:56.664152 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 23 01:04:56.701698 systemd-logind[1625]: New seat seat0. Jan 23 01:04:56.709214 systemd-logind[1625]: Watching system buttons on /dev/input/event3 (Power Button) Jan 23 01:04:56.709235 systemd-logind[1625]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 23 01:04:56.709452 systemd[1]: Started systemd-logind.service - User Login Management. Jan 23 01:04:56.837536 locksmithd[1664]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 23 01:04:56.868830 containerd[1649]: time="2026-01-23T01:04:56Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 23 01:04:56.873521 containerd[1649]: time="2026-01-23T01:04:56.873126283Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Jan 23 01:04:56.883945 containerd[1649]: time="2026-01-23T01:04:56.883909548Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.131µs" Jan 23 01:04:56.883945 containerd[1649]: time="2026-01-23T01:04:56.883942369Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 23 01:04:56.895643 containerd[1649]: time="2026-01-23T01:04:56.883983673Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 23 01:04:56.895786 containerd[1649]: time="2026-01-23T01:04:56.895760876Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 23 01:04:56.895814 containerd[1649]: time="2026-01-23T01:04:56.895800923Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 23 01:04:56.895832 containerd[1649]: time="2026-01-23T01:04:56.895826879Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 01:04:56.895883 containerd[1649]: time="2026-01-23T01:04:56.895869840Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 01:04:56.895904 containerd[1649]: time="2026-01-23T01:04:56.895884558Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896111 containerd[1649]: time="2026-01-23T01:04:56.896096810Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896162 containerd[1649]: time="2026-01-23T01:04:56.896111890Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896162 containerd[1649]: time="2026-01-23T01:04:56.896120445Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896162 containerd[1649]: time="2026-01-23T01:04:56.896127313Z" level=info 
msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896212 containerd[1649]: time="2026-01-23T01:04:56.896183705Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896744 containerd[1649]: time="2026-01-23T01:04:56.896399187Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896744 containerd[1649]: time="2026-01-23T01:04:56.896427715Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 01:04:56.896744 containerd[1649]: time="2026-01-23T01:04:56.896436295Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 23 01:04:56.897706 containerd[1649]: time="2026-01-23T01:04:56.897689985Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 23 01:04:56.898415 containerd[1649]: time="2026-01-23T01:04:56.898399802Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 23 01:04:56.900805 containerd[1649]: time="2026-01-23T01:04:56.900789030Z" level=info msg="metadata content store policy set" policy=shared Jan 23 01:04:56.944752 bash[1679]: Updated "/home/core/.ssh/authorized_keys" Jan 23 01:04:56.947555 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 23 01:04:56.953700 systemd[1]: Starting sshkeys.service... Jan 23 01:04:56.977677 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 23 01:04:56.982737 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 23 01:04:57.001540 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:57.001851 containerd[1649]: time="2026-01-23T01:04:57.001807642Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 23 01:04:57.001889 containerd[1649]: time="2026-01-23T01:04:57.001870853Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 23 01:04:57.001889 containerd[1649]: time="2026-01-23T01:04:57.001885537Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 23 01:04:57.001923 containerd[1649]: time="2026-01-23T01:04:57.001898508Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 23 01:04:57.001923 containerd[1649]: time="2026-01-23T01:04:57.001910144Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 23 01:04:57.001923 containerd[1649]: time="2026-01-23T01:04:57.001919193Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.001928859Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.001939330Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.001959926Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.001977481Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.001987268Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 23 01:04:57.002001 containerd[1649]: time="2026-01-23T01:04:57.002000680Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 23 01:04:57.002157 containerd[1649]: time="2026-01-23T01:04:57.002137779Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 23 01:04:57.002178 containerd[1649]: time="2026-01-23T01:04:57.002159974Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 23 01:04:57.002178 containerd[1649]: time="2026-01-23T01:04:57.002171990Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 23 01:04:57.002214 containerd[1649]: time="2026-01-23T01:04:57.002186238Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 23 01:04:57.002214 containerd[1649]: time="2026-01-23T01:04:57.002196957Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 23 01:04:57.002214 containerd[1649]: time="2026-01-23T01:04:57.002210809Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 23 01:04:57.002269 containerd[1649]: time="2026-01-23T01:04:57.002237259Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 23 01:04:57.002269 containerd[1649]: time="2026-01-23T01:04:57.002247637Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 23 01:04:57.002269 containerd[1649]: time="2026-01-23T01:04:57.002257660Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 23 01:04:57.002394 containerd[1649]: time="2026-01-23T01:04:57.002376244Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 23 01:04:57.002415 containerd[1649]: time="2026-01-23T01:04:57.002393368Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 23 01:04:57.002453 containerd[1649]: time="2026-01-23T01:04:57.002437132Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 23 01:04:57.002476 containerd[1649]: time="2026-01-23T01:04:57.002457052Z" level=info msg="Start snapshots syncer" Jan 23 01:04:57.002495 containerd[1649]: time="2026-01-23T01:04:57.002478669Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 23 01:04:57.003245 containerd[1649]: time="2026-01-23T01:04:57.002761183Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 23 01:04:57.003354 containerd[1649]: time="2026-01-23T01:04:57.003292889Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.005844647Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006325857Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: 
time="2026-01-23T01:04:57.006349493Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006359381Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006368387Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006395919Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006409842Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006419165Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006474955Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006486588Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006495797Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006608453Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006622726Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 01:04:57.007523 containerd[1649]: time="2026-01-23T01:04:57.006630685Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.006639564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.006646810Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.006982886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007001258Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007027306Z" level=info msg="runtime interface created" Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007032336Z" level=info msg="created NRI interface" Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007040092Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007050923Z" level=info msg="Connect containerd service" Jan 23 01:04:57.007788 containerd[1649]: time="2026-01-23T01:04:57.007069763Z" level=info msg="using 
experimental NRI integration - disable nri plugin to prevent this" Jan 23 01:04:57.008551 containerd[1649]: time="2026-01-23T01:04:57.008531306Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 01:04:57.170889 containerd[1649]: time="2026-01-23T01:04:57.170845600Z" level=info msg="Start subscribing containerd event" Jan 23 01:04:57.171540 containerd[1649]: time="2026-01-23T01:04:57.171492844Z" level=info msg="Start recovering state" Jan 23 01:04:57.171655 containerd[1649]: time="2026-01-23T01:04:57.171638007Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 23 01:04:57.171701 containerd[1649]: time="2026-01-23T01:04:57.171691539Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 23 01:04:57.173469 containerd[1649]: time="2026-01-23T01:04:57.173453202Z" level=info msg="Start event monitor" Jan 23 01:04:57.173527 containerd[1649]: time="2026-01-23T01:04:57.173476237Z" level=info msg="Start cni network conf syncer for default" Jan 23 01:04:57.173547 containerd[1649]: time="2026-01-23T01:04:57.173528026Z" level=info msg="Start streaming server" Jan 23 01:04:57.173547 containerd[1649]: time="2026-01-23T01:04:57.173542393Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 23 01:04:57.173586 containerd[1649]: time="2026-01-23T01:04:57.173548553Z" level=info msg="runtime interface starting up..." Jan 23 01:04:57.173586 containerd[1649]: time="2026-01-23T01:04:57.173554486Z" level=info msg="starting plugins..." Jan 23 01:04:57.173586 containerd[1649]: time="2026-01-23T01:04:57.173566093Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 23 01:04:57.174174 systemd[1]: Started containerd.service - containerd container runtime. Jan 23 01:04:57.176812 containerd[1649]: time="2026-01-23T01:04:57.176797354Z" level=info msg="containerd successfully booted in 0.314349s" Jan 23 01:04:57.254465 sshd_keygen[1650]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 01:04:57.278813 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 01:04:57.283209 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 23 01:04:57.303450 systemd[1]: issuegen.service: Deactivated successfully. Jan 23 01:04:57.303786 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 01:04:57.309540 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 23 01:04:57.322007 tar[1635]: linux-amd64/README.md Jan 23 01:04:57.341855 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 23 01:04:57.348223 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 23 01:04:57.352749 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 23 01:04:57.357115 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 23 01:04:57.358225 systemd[1]: Reached target getty.target - Login Prompts. Jan 23 01:04:57.444562 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Jan 23 01:04:57.508810 systemd-networkd[1330]: eth0: Gained IPv6LL Jan 23 01:04:57.512642 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 23 01:04:57.519812 systemd[1]: Reached target network-online.target - Network is Online. 
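The single error in the containerd startup, "no network config found in /etc/cni/net.d", is expected on a node that has not joined a cluster yet: the CRI plugin defers CNI setup until a config appears. Purely as a sketch, a minimal bridge conflist that would satisfy it (name and subnet invented; assumes the standard bridge/host-local/portmap binaries under the /opt/cni/bin path shown in the CRI config dump):

    sudo mkdir -p /etc/cni/net.d
    sudo tee /etc/cni/net.d/10-example.conflist <<'EOF'
    {
      "cniVersion": "1.0.0",
      "name": "example-net",
      "plugins": [
        { "type": "bridge", "bridge": "cni0", "isGateway": true, "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" } },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF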
Jan 23 01:04:57.525638 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 01:04:57.531940 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 23 01:04:57.597626 extend-filesystems[1661]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 23 01:04:57.597626 extend-filesystems[1661]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 23 01:04:57.597626 extend-filesystems[1661]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Jan 23 01:04:57.609621 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:57.609673 extend-filesystems[1618]: Resized filesystem in /dev/vda9 Jan 23 01:04:57.602795 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 23 01:04:57.603118 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 23 01:04:57.619658 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 23 01:04:58.020563 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:04:59.177963 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 01:04:59.195288 (kubelet)[1749]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 01:04:59.638600 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:05:00.038554 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:05:00.041799 kubelet[1749]: E0123 01:05:00.041740 1749 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 01:05:00.046077 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 01:05:00.046388 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 01:05:00.047012 systemd[1]: kubelet.service: Consumed 1.379s CPU time, 258M memory peak. Jan 23 01:05:03.652576 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:05:03.666017 coreos-metadata[1614]: Jan 23 01:05:03.665 WARN failed to locate config-drive, using the metadata service API instead Jan 23 01:05:03.700660 coreos-metadata[1614]: Jan 23 01:05:03.700 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 23 01:05:04.056585 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 01:05:04.075570 coreos-metadata[1694]: Jan 23 01:05:04.074 WARN failed to locate config-drive, using the metadata service API instead Jan 23 01:05:04.115715 coreos-metadata[1694]: Jan 23 01:05:04.115 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 23 01:05:05.589269 coreos-metadata[1694]: Jan 23 01:05:05.589 INFO Fetch successful Jan 23 01:05:05.589269 coreos-metadata[1694]: Jan 23 01:05:05.589 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 23 01:05:05.590658 coreos-metadata[1614]: Jan 23 01:05:05.590 INFO Fetch successful Jan 23 01:05:05.591006 coreos-metadata[1614]: Jan 23 01:05:05.590 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 01:05:06.365359 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
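The kubelet failure above is the expected first-boot state: /var/lib/kubelet/config.yaml is written by provisioning (typically kubeadm init or kubeadm join), so until that runs the unit exits and systemd reschedules it. For illustration only, the smallest file that gets past this specific load error looks like the following; a real node needs the full kubeadm-generated configuration:

    sudo tee /var/lib/kubelet/config.yaml <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd    # matches SystemdCgroup=true in the containerd CRI config above
    EOF
    sudo systemctl restart kubelet.service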
Jan 23 01:05:06.368800 systemd[1]: Started sshd@0-10.0.5.114:22-20.161.92.111:45576.service - OpenSSH per-connection server daemon (20.161.92.111:45576). Jan 23 01:05:07.083494 sshd[1767]: Accepted publickey for core from 20.161.92.111 port 45576 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:07.085076 sshd-session[1767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:07.134906 systemd-logind[1625]: New session 1 of user core. Jan 23 01:05:07.139789 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 23 01:05:07.143026 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 23 01:05:07.189755 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 23 01:05:07.196354 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 23 01:05:07.224318 (systemd)[1772]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 23 01:05:07.230305 systemd-logind[1625]: New session c1 of user core. Jan 23 01:05:07.406336 systemd[1772]: Queued start job for default target default.target. Jan 23 01:05:07.413318 systemd[1772]: Created slice app.slice - User Application Slice. Jan 23 01:05:07.413345 systemd[1772]: Reached target paths.target - Paths. Jan 23 01:05:07.413382 systemd[1772]: Reached target timers.target - Timers. Jan 23 01:05:07.414635 systemd[1772]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 23 01:05:07.449422 systemd[1772]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 23 01:05:07.449502 systemd[1772]: Reached target sockets.target - Sockets. Jan 23 01:05:07.449573 systemd[1772]: Reached target basic.target - Basic System. Jan 23 01:05:07.449624 systemd[1772]: Reached target default.target - Main User Target. Jan 23 01:05:07.449664 systemd[1772]: Startup finished in 206ms. Jan 23 01:05:07.450500 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 23 01:05:07.460023 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 23 01:05:07.935157 systemd[1]: Started sshd@1-10.0.5.114:22-20.161.92.111:45590.service - OpenSSH per-connection server daemon (20.161.92.111:45590). Jan 23 01:05:08.607574 sshd[1783]: Accepted publickey for core from 20.161.92.111 port 45590 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:08.608995 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:08.617936 systemd-logind[1625]: New session 2 of user core. Jan 23 01:05:08.623053 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 23 01:05:08.636144 coreos-metadata[1694]: Jan 23 01:05:08.635 INFO Fetch successful Jan 23 01:05:08.639350 unknown[1694]: wrote ssh authorized keys file for user: core Jan 23 01:05:08.653136 coreos-metadata[1614]: Jan 23 01:05:08.653 INFO Fetch successful Jan 23 01:05:08.653136 coreos-metadata[1614]: Jan 23 01:05:08.653 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 23 01:05:08.670784 update-ssh-keys[1787]: Updated "/home/core/.ssh/authorized_keys" Jan 23 01:05:08.671949 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 23 01:05:08.674146 systemd[1]: Finished sshkeys.service. 
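coreos-metadata never found a config-drive (the recurring "config-2: Can't lookup blockdev" probes) and fell back to the OpenStack metadata service over HTTP, which is where the SSH key for user "core" came from. The same endpoints from the log can be replayed by hand from inside the instance (169.254.169.254 is link-local and only reachable there):

    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key
    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json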
Jan 23 01:05:09.052324 sshd[1786]: Connection closed by 20.161.92.111 port 45590 Jan 23 01:05:09.053423 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:09.063319 systemd[1]: sshd@1-10.0.5.114:22-20.161.92.111:45590.service: Deactivated successfully. Jan 23 01:05:09.068747 systemd[1]: session-2.scope: Deactivated successfully. Jan 23 01:05:09.070985 systemd-logind[1625]: Session 2 logged out. Waiting for processes to exit. Jan 23 01:05:09.074450 systemd-logind[1625]: Removed session 2. Jan 23 01:05:09.164886 systemd[1]: Started sshd@2-10.0.5.114:22-20.161.92.111:45596.service - OpenSSH per-connection server daemon (20.161.92.111:45596). Jan 23 01:05:09.232587 coreos-metadata[1614]: Jan 23 01:05:09.232 INFO Fetch successful Jan 23 01:05:09.232587 coreos-metadata[1614]: Jan 23 01:05:09.232 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 23 01:05:09.807447 coreos-metadata[1614]: Jan 23 01:05:09.807 INFO Fetch successful Jan 23 01:05:09.807447 coreos-metadata[1614]: Jan 23 01:05:09.807 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 23 01:05:09.817369 sshd[1796]: Accepted publickey for core from 20.161.92.111 port 45596 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:09.820416 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:09.832606 systemd-logind[1625]: New session 3 of user core. Jan 23 01:05:09.843872 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 23 01:05:10.088733 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 23 01:05:10.092585 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 01:05:10.256468 sshd[1799]: Connection closed by 20.161.92.111 port 45596 Jan 23 01:05:10.257428 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:10.265274 systemd[1]: sshd@2-10.0.5.114:22-20.161.92.111:45596.service: Deactivated successfully. Jan 23 01:05:10.271386 systemd[1]: session-3.scope: Deactivated successfully. Jan 23 01:05:10.275089 systemd-logind[1625]: Session 3 logged out. Waiting for processes to exit. Jan 23 01:05:10.279794 systemd-logind[1625]: Removed session 3. Jan 23 01:05:10.323886 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 01:05:10.338020 (kubelet)[1812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 01:05:10.386885 coreos-metadata[1614]: Jan 23 01:05:10.386 INFO Fetch successful Jan 23 01:05:10.386885 coreos-metadata[1614]: Jan 23 01:05:10.386 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 23 01:05:10.645481 kubelet[1812]: E0123 01:05:10.645270 1812 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 01:05:10.653595 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 01:05:10.653993 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 01:05:10.655180 systemd[1]: kubelet.service: Consumed 301ms CPU time, 109.6M memory peak. 
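Each "Scheduled restart job, restart counter is at N" line that follows is systemd's Restart= policy relaunching the failed kubelet, with a per-unit counter ticking up. Assuming a systemd new enough to expose NRestarts (v235+, which this image clearly is), the loop can be observed with:

    systemctl show kubelet.service -p Restart,RestartUSec,NRestarts
    journalctl -u kubelet.service --since=-10min --no-pager | tail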
Jan 23 01:05:11.867000 coreos-metadata[1614]: Jan 23 01:05:11.866 INFO Fetch successful Jan 23 01:05:11.903842 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 23 01:05:11.904570 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 23 01:05:11.904708 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 23 01:05:11.905431 systemd[1]: Startup finished in 4.006s (kernel) + 15.537s (initrd) + 18.753s (userspace) = 38.296s. Jan 23 01:05:20.317855 chronyd[1612]: Selected source PHC0 Jan 23 01:05:20.381420 systemd[1]: Started sshd@3-10.0.5.114:22-20.161.92.111:50054.service - OpenSSH per-connection server daemon (20.161.92.111:50054). Jan 23 01:05:20.839133 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 01:05:20.843149 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 01:05:21.044164 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 01:05:21.053849 (kubelet)[1836]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 01:05:21.089834 kubelet[1836]: E0123 01:05:21.089644 1836 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 01:05:21.096308 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 01:05:21.097223 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 01:05:21.098502 systemd[1]: kubelet.service: Consumed 180ms CPU time, 110.3M memory peak. Jan 23 01:05:21.102157 sshd[1825]: Accepted publickey for core from 20.161.92.111 port 50054 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:21.103907 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:21.112445 systemd-logind[1625]: New session 4 of user core. Jan 23 01:05:21.134070 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 23 01:05:21.575645 sshd[1843]: Connection closed by 20.161.92.111 port 50054 Jan 23 01:05:21.576809 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:21.585448 systemd[1]: sshd@3-10.0.5.114:22-20.161.92.111:50054.service: Deactivated successfully. Jan 23 01:05:21.589672 systemd[1]: session-4.scope: Deactivated successfully. Jan 23 01:05:21.593111 systemd-logind[1625]: Session 4 logged out. Waiting for processes to exit. Jan 23 01:05:21.596758 systemd-logind[1625]: Removed session 4. Jan 23 01:05:21.709369 systemd[1]: Started sshd@4-10.0.5.114:22-20.161.92.111:50068.service - OpenSSH per-connection server daemon (20.161.92.111:50068). Jan 23 01:05:22.401637 sshd[1849]: Accepted publickey for core from 20.161.92.111 port 50068 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:22.404994 sshd-session[1849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:22.418405 systemd-logind[1625]: New session 5 of user core. Jan 23 01:05:22.440047 systemd[1]: Started session-5.scope - Session 5 of User core. 
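"Selected source PHC0" means chronyd settled on a PTP hardware clock as its time reference rather than a network NTP server; on a KVM guest like this one that is the host clock exposed through the ptp_kvm module loaded at the top of this section, presumably via a refclock PHC directive in the chrony configuration. To verify the selection:

    chronyc tracking      # Reference ID should name PHC0
    chronyc sources -v    # PHC0 marked '*' as the selected source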
Jan 23 01:05:22.864769 sshd[1852]: Connection closed by 20.161.92.111 port 50068 Jan 23 01:05:22.866161 sshd-session[1849]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:22.876346 systemd[1]: sshd@4-10.0.5.114:22-20.161.92.111:50068.service: Deactivated successfully. Jan 23 01:05:22.882226 systemd[1]: session-5.scope: Deactivated successfully. Jan 23 01:05:22.884379 systemd-logind[1625]: Session 5 logged out. Waiting for processes to exit. Jan 23 01:05:22.887439 systemd-logind[1625]: Removed session 5. Jan 23 01:05:22.996660 systemd[1]: Started sshd@5-10.0.5.114:22-20.161.92.111:39214.service - OpenSSH per-connection server daemon (20.161.92.111:39214). Jan 23 01:05:23.732162 sshd[1858]: Accepted publickey for core from 20.161.92.111 port 39214 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:23.735320 sshd-session[1858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:23.747637 systemd-logind[1625]: New session 6 of user core. Jan 23 01:05:23.760936 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 23 01:05:24.219752 sshd[1861]: Connection closed by 20.161.92.111 port 39214 Jan 23 01:05:24.220999 sshd-session[1858]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:24.231059 systemd[1]: sshd@5-10.0.5.114:22-20.161.92.111:39214.service: Deactivated successfully. Jan 23 01:05:24.235240 systemd[1]: session-6.scope: Deactivated successfully. Jan 23 01:05:24.237750 systemd-logind[1625]: Session 6 logged out. Waiting for processes to exit. Jan 23 01:05:24.242276 systemd-logind[1625]: Removed session 6. Jan 23 01:05:24.347207 systemd[1]: Started sshd@6-10.0.5.114:22-20.161.92.111:39220.service - OpenSSH per-connection server daemon (20.161.92.111:39220). Jan 23 01:05:25.061670 sshd[1867]: Accepted publickey for core from 20.161.92.111 port 39220 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:25.064138 sshd-session[1867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:25.076577 systemd-logind[1625]: New session 7 of user core. Jan 23 01:05:25.089016 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 23 01:05:25.462143 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 23 01:05:25.462578 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 01:05:25.492076 sudo[1871]: pam_unix(sudo:session): session closed for user root Jan 23 01:05:25.597687 sshd[1870]: Connection closed by 20.161.92.111 port 39220 Jan 23 01:05:25.599267 sshd-session[1867]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:25.609659 systemd[1]: sshd@6-10.0.5.114:22-20.161.92.111:39220.service: Deactivated successfully. Jan 23 01:05:25.613955 systemd[1]: session-7.scope: Deactivated successfully. Jan 23 01:05:25.616111 systemd-logind[1625]: Session 7 logged out. Waiting for processes to exit. Jan 23 01:05:25.620001 systemd-logind[1625]: Removed session 7. Jan 23 01:05:25.725008 systemd[1]: Started sshd@7-10.0.5.114:22-20.161.92.111:39232.service - OpenSSH per-connection server daemon (20.161.92.111:39232). 
Jan 23 01:05:26.433693 sshd[1877]: Accepted publickey for core from 20.161.92.111 port 39232 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:26.436264 sshd-session[1877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:26.450699 systemd-logind[1625]: New session 8 of user core. Jan 23 01:05:26.457901 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 23 01:05:26.810605 sudo[1882]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 01:05:26.811841 sudo[1882]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 01:05:26.819840 sudo[1882]: pam_unix(sudo:session): session closed for user root Jan 23 01:05:26.827711 sudo[1881]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 01:05:26.828089 sudo[1881]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 01:05:26.858342 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 01:05:26.925213 augenrules[1904]: No rules Jan 23 01:05:26.926571 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 01:05:26.926972 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 01:05:26.928160 sudo[1881]: pam_unix(sudo:session): session closed for user root Jan 23 01:05:27.035655 sshd[1880]: Connection closed by 20.161.92.111 port 39232 Jan 23 01:05:27.036884 sshd-session[1877]: pam_unix(sshd:session): session closed for user core Jan 23 01:05:27.046319 systemd-logind[1625]: Session 8 logged out. Waiting for processes to exit. Jan 23 01:05:27.046518 systemd[1]: sshd@7-10.0.5.114:22-20.161.92.111:39232.service: Deactivated successfully. Jan 23 01:05:27.050762 systemd[1]: session-8.scope: Deactivated successfully. Jan 23 01:05:27.056429 systemd-logind[1625]: Removed session 8. Jan 23 01:05:27.173558 systemd[1]: Started sshd@8-10.0.5.114:22-20.161.92.111:39248.service - OpenSSH per-connection server daemon (20.161.92.111:39248). Jan 23 01:05:27.885600 sshd[1913]: Accepted publickey for core from 20.161.92.111 port 39248 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:05:27.887485 sshd-session[1913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:05:27.898931 systemd-logind[1625]: New session 9 of user core. Jan 23 01:05:27.910850 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 23 01:05:28.236465 sudo[1917]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 01:05:28.237417 sudo[1917]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 01:05:28.939389 systemd[1]: Starting docker.service - Docker Application Container Engine... 
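The sudo commands above delete the shipped audit rule files and restart audit-rules.service, so augenrules, which concatenates everything under /etc/audit/rules.d/ into /etc/audit/audit.rules, correctly reports "No rules". The round trip, with an illustrative watch rule:

    echo '-w /etc/ssh/sshd_config -p wa -k sshd_config' \
      | sudo tee /etc/audit/rules.d/10-example.rules
    sudo augenrules --load    # compile rules.d/* and load into the kernel
    sudo auditctl -l          # list what is now active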
Jan 23 01:05:28.962133 (dockerd)[1934]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 01:05:29.367821 dockerd[1934]: time="2026-01-23T01:05:29.367775299Z" level=info msg="Starting up" Jan 23 01:05:29.369011 dockerd[1934]: time="2026-01-23T01:05:29.368991955Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 01:05:29.388268 dockerd[1934]: time="2026-01-23T01:05:29.388198455Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 01:05:29.472037 dockerd[1934]: time="2026-01-23T01:05:29.471806546Z" level=info msg="Loading containers: start." Jan 23 01:05:29.486552 kernel: Initializing XFRM netlink socket Jan 23 01:05:29.797398 systemd-networkd[1330]: docker0: Link UP Jan 23 01:05:29.807188 dockerd[1934]: time="2026-01-23T01:05:29.806633040Z" level=info msg="Loading containers: done." Jan 23 01:05:29.822997 dockerd[1934]: time="2026-01-23T01:05:29.822959292Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 01:05:29.823211 dockerd[1934]: time="2026-01-23T01:05:29.823192943Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 01:05:29.823373 dockerd[1934]: time="2026-01-23T01:05:29.823357762Z" level=info msg="Initializing buildkit" Jan 23 01:05:29.871689 dockerd[1934]: time="2026-01-23T01:05:29.871647846Z" level=info msg="Completed buildkit initialization" Jan 23 01:05:29.876358 dockerd[1934]: time="2026-01-23T01:05:29.876325431Z" level=info msg="Daemon has completed initialization" Jan 23 01:05:29.876642 dockerd[1934]: time="2026-01-23T01:05:29.876583410Z" level=info msg="API listen on /run/docker.sock" Jan 23 01:05:29.877660 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 23 01:05:31.338693 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 23 01:05:31.342346 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 01:05:31.566100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 01:05:31.583816 (kubelet)[2155]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 01:05:31.610578 containerd[1649]: time="2026-01-23T01:05:31.609894775Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 23 01:05:31.660012 kubelet[2155]: E0123 01:05:31.659958 2155 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 01:05:31.663212 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 01:05:31.663443 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 01:05:31.664328 systemd[1]: kubelet.service: Consumed 265ms CPU time, 110.4M memory peak. Jan 23 01:05:32.790181 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3488470642.mount: Deactivated successfully. 
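Docker came up with the overlay2 storage driver but warned that native diff is disabled because the kernel enables CONFIG_OVERLAY_FS_REDIRECT_DIR; that only slows image builds, it does not affect running containers. To confirm the driver in use (the kernel-config check assumes IKCONFIG is compiled in, which may not hold on this image):

    docker info --format '{{.Driver}} {{.ServerVersion}}'
    zgrep CONFIG_OVERLAY_FS_REDIRECT_DIR /proc/config.gz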
Jan 23 01:05:33.906355 containerd[1649]: time="2026-01-23T01:05:33.906300540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:33.908139 containerd[1649]: time="2026-01-23T01:05:33.907924983Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=27068171" Jan 23 01:05:33.909522 containerd[1649]: time="2026-01-23T01:05:33.909486925Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:33.912406 containerd[1649]: time="2026-01-23T01:05:33.912384230Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:33.913192 containerd[1649]: time="2026-01-23T01:05:33.913171968Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 2.303196692s" Jan 23 01:05:33.913242 containerd[1649]: time="2026-01-23T01:05:33.913199549Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Jan 23 01:05:33.913797 containerd[1649]: time="2026-01-23T01:05:33.913776428Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 23 01:05:35.217299 containerd[1649]: time="2026-01-23T01:05:35.217248603Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:35.218918 containerd[1649]: time="2026-01-23T01:05:35.218703216Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21162460" Jan 23 01:05:35.221389 containerd[1649]: time="2026-01-23T01:05:35.221368781Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:35.224392 containerd[1649]: time="2026-01-23T01:05:35.224371580Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:35.225103 containerd[1649]: time="2026-01-23T01:05:35.225084040Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.311285738s" Jan 23 01:05:35.225146 containerd[1649]: time="2026-01-23T01:05:35.225108526Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Jan 23 01:05:35.225432 
containerd[1649]: time="2026-01-23T01:05:35.225418754Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 23 01:05:36.239411 containerd[1649]: time="2026-01-23T01:05:36.238670642Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:36.239990 containerd[1649]: time="2026-01-23T01:05:36.239970669Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15725947" Jan 23 01:05:36.241673 containerd[1649]: time="2026-01-23T01:05:36.241655831Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:36.244393 containerd[1649]: time="2026-01-23T01:05:36.244373313Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:36.245142 containerd[1649]: time="2026-01-23T01:05:36.245043298Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 1.019603993s" Jan 23 01:05:36.245142 containerd[1649]: time="2026-01-23T01:05:36.245072806Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Jan 23 01:05:36.245687 containerd[1649]: time="2026-01-23T01:05:36.245623746Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 23 01:05:37.629505 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount472829730.mount: Deactivated successfully. 
Jan 23 01:05:37.881876 containerd[1649]: time="2026-01-23T01:05:37.881761802Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:37.883777 containerd[1649]: time="2026-01-23T01:05:37.883754524Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=25965319" Jan 23 01:05:37.885298 containerd[1649]: time="2026-01-23T01:05:37.885279881Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:37.888139 containerd[1649]: time="2026-01-23T01:05:37.888119279Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:05:37.888506 containerd[1649]: time="2026-01-23T01:05:37.888421599Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 1.642775587s" Jan 23 01:05:37.888584 containerd[1649]: time="2026-01-23T01:05:37.888573518Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Jan 23 01:05:37.889297 containerd[1649]: time="2026-01-23T01:05:37.889209103Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 23 01:05:38.567565 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount179506317.mount: Deactivated successfully. 
Jan 23 01:05:39.447078 containerd[1649]: time="2026-01-23T01:05:39.447018943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:39.448983 containerd[1649]: time="2026-01-23T01:05:39.448954258Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388099"
Jan 23 01:05:39.451174 containerd[1649]: time="2026-01-23T01:05:39.451148024Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:39.454532 containerd[1649]: time="2026-01-23T01:05:39.454045553Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:39.455201 containerd[1649]: time="2026-01-23T01:05:39.455056165Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.56564544s"
Jan 23 01:05:39.455201 containerd[1649]: time="2026-01-23T01:05:39.455082372Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\""
Jan 23 01:05:39.456082 containerd[1649]: time="2026-01-23T01:05:39.456064393Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Jan 23 01:05:40.109587 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount111593140.mount: Deactivated successfully.
Jan 23 01:05:40.119036 containerd[1649]: time="2026-01-23T01:05:40.118936815Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:40.122293 containerd[1649]: time="2026-01-23T01:05:40.122225827Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321238"
Jan 23 01:05:40.124569 containerd[1649]: time="2026-01-23T01:05:40.124493047Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:40.129953 containerd[1649]: time="2026-01-23T01:05:40.129808836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:40.131593 containerd[1649]: time="2026-01-23T01:05:40.131011820Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 674.918636ms"
Jan 23 01:05:40.131593 containerd[1649]: time="2026-01-23T01:05:40.131070684Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Jan 23 01:05:40.132670 containerd[1649]: time="2026-01-23T01:05:40.132370443Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\""
Jan 23 01:05:40.839623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3105690621.mount: Deactivated successfully.
Jan 23 01:05:41.610878 update_engine[1630]: I20260123 01:05:41.610790  1630 update_attempter.cc:509] Updating boot flags...
Jan 23 01:05:41.692313 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Jan 23 01:05:41.707401 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 01:05:43.788365 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:43.799928 (kubelet)[2371]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 23 01:05:43.866664 kubelet[2371]: E0123 01:05:43.866625    2371 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 23 01:05:43.869346 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 01:05:43.869475 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 23 01:05:43.870865 systemd[1]: kubelet.service: Consumed 232ms CPU time, 110.6M memory peak.
Jan 23 01:05:45.273549 containerd[1649]: time="2026-01-23T01:05:45.273142714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:45.276497 containerd[1649]: time="2026-01-23T01:05:45.276220944Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=74166870"
Jan 23 01:05:45.278882 containerd[1649]: time="2026-01-23T01:05:45.277643252Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:46.350808 containerd[1649]: time="2026-01-23T01:05:46.350710420Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:05:46.353417 containerd[1649]: time="2026-01-23T01:05:46.353339794Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 6.220919642s"
Jan 23 01:05:46.353498 containerd[1649]: time="2026-01-23T01:05:46.353415476Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\""
Jan 23 01:05:49.501595 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:49.501735 systemd[1]: kubelet.service: Consumed 232ms CPU time, 110.6M memory peak.
Jan 23 01:05:49.504469 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 01:05:49.529321 systemd[1]: Reload requested from client PID 2408 ('systemctl') (unit session-9.scope)...
Jan 23 01:05:49.529335 systemd[1]: Reloading...
Jan 23 01:05:49.633559 zram_generator::config[2450]: No configuration found.
Jan 23 01:05:49.822055 systemd[1]: Reloading finished in 292 ms.
Jan 23 01:05:49.874463 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Jan 23 01:05:49.874548 systemd[1]: kubelet.service: Failed with result 'signal'.
Jan 23 01:05:49.874872 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:49.874913 systemd[1]: kubelet.service: Consumed 99ms CPU time, 98.2M memory peak.
Jan 23 01:05:49.877092 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 01:05:50.566639 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:50.578770 (kubelet)[2503]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 23 01:05:50.615922 kubelet[2503]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Jan 23 01:05:50.615922 kubelet[2503]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 23 01:05:50.616438 kubelet[2503]: I0123 01:05:50.615962    2503 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 23 01:05:51.265536 kubelet[2503]: I0123 01:05:51.265452    2503 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Jan 23 01:05:51.266278 kubelet[2503]: I0123 01:05:51.265706    2503 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 23 01:05:51.266278 kubelet[2503]: I0123 01:05:51.265766    2503 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Jan 23 01:05:51.266278 kubelet[2503]: I0123 01:05:51.265776    2503 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Jan 23 01:05:51.266278 kubelet[2503]: I0123 01:05:51.266006    2503 server.go:956] "Client rotation is on, will bootstrap in background"
Jan 23 01:05:51.275189 kubelet[2503]: E0123 01:05:51.275048    2503 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.5.114:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Jan 23 01:05:51.275478 kubelet[2503]: I0123 01:05:51.275449    2503 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 23 01:05:51.286442 kubelet[2503]: I0123 01:05:51.286311    2503 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 23 01:05:51.289117 kubelet[2503]: I0123 01:05:51.289091    2503 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Jan 23 01:05:51.292848 kubelet[2503]: I0123 01:05:51.292774    2503 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 23 01:05:51.293032 kubelet[2503]: I0123 01:05:51.292822    2503 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-n-615049e46b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 23 01:05:51.293032 kubelet[2503]: I0123 01:05:51.293028    2503 topology_manager.go:138] "Creating topology manager with none policy"
Jan 23 01:05:51.293368 kubelet[2503]: I0123 01:05:51.293042    2503 container_manager_linux.go:306] "Creating device plugin manager"
Jan 23 01:05:51.293368 kubelet[2503]: I0123 01:05:51.293141    2503 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Jan 23 01:05:51.298815 kubelet[2503]: I0123 01:05:51.298741    2503 state_mem.go:36] "Initialized new in-memory state store"
Jan 23 01:05:51.299000 kubelet[2503]: I0123 01:05:51.298942    2503 kubelet.go:475] "Attempting to sync node with API server"
Jan 23 01:05:51.299000 kubelet[2503]: I0123 01:05:51.298961    2503 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 23 01:05:51.299000 kubelet[2503]: I0123 01:05:51.298986    2503 kubelet.go:387] "Adding apiserver pod source"
Jan 23 01:05:51.299000 kubelet[2503]: I0123 01:05:51.299007    2503 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 23 01:05:51.308091 kubelet[2503]: E0123 01:05:51.307288    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.5.114:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Jan 23 01:05:51.308091 kubelet[2503]: E0123 01:05:51.307592    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.5.114:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-615049e46b&limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
\"https://10.0.5.114:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-615049e46b&limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 23 01:05:51.311834 kubelet[2503]: I0123 01:05:51.311798 2503 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Jan 23 01:05:51.312551 kubelet[2503]: I0123 01:05:51.312351 2503 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 01:05:51.312551 kubelet[2503]: I0123 01:05:51.312387 2503 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 23 01:05:51.312551 kubelet[2503]: W0123 01:05:51.312437 2503 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 23 01:05:51.319075 kubelet[2503]: I0123 01:05:51.319012 2503 server.go:1262] "Started kubelet" Jan 23 01:05:51.319276 kubelet[2503]: I0123 01:05:51.319216 2503 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 01:05:51.321777 kubelet[2503]: I0123 01:05:51.321737 2503 server.go:310] "Adding debug handlers to kubelet server" Jan 23 01:05:51.328961 kubelet[2503]: I0123 01:05:51.327751 2503 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 01:05:51.328961 kubelet[2503]: I0123 01:05:51.327798 2503 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 23 01:05:51.328961 kubelet[2503]: I0123 01:05:51.328045 2503 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 01:05:51.332141 kubelet[2503]: E0123 01:05:51.328220 2503 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.5.114:6443/api/v1/namespaces/default/events\": dial tcp 10.0.5.114:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-n-615049e46b.188d36ae67bad740 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-n-615049e46b,UID:ci-4459-2-2-n-615049e46b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-615049e46b,},FirstTimestamp:2026-01-23 01:05:51.318980416 +0000 UTC m=+0.736847540,LastTimestamp:2026-01-23 01:05:51.318980416 +0000 UTC m=+0.736847540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-615049e46b,}" Jan 23 01:05:51.335954 kubelet[2503]: I0123 01:05:51.335120 2503 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 01:05:51.336623 kubelet[2503]: I0123 01:05:51.336102 2503 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 01:05:51.337964 kubelet[2503]: I0123 01:05:51.337900 2503 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 23 01:05:51.343722 kubelet[2503]: I0123 01:05:51.343693 2503 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 23 01:05:51.343872 kubelet[2503]: I0123 01:05:51.343737 2503 
Jan 23 01:05:51.344793 kubelet[2503]: E0123 01:05:51.344489    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.5.114:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Jan 23 01:05:51.344793 kubelet[2503]: E0123 01:05:51.344765    2503 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-615049e46b\" not found"
Jan 23 01:05:51.344999 kubelet[2503]: E0123 01:05:51.344833    2503 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.114:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-615049e46b?timeout=10s\": dial tcp 10.0.5.114:6443: connect: connection refused" interval="200ms"
Jan 23 01:05:51.345558 kubelet[2503]: I0123 01:05:51.345507    2503 factory.go:223] Registration of the systemd container factory successfully
Jan 23 01:05:51.346654 kubelet[2503]: I0123 01:05:51.345611    2503 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 23 01:05:51.346654 kubelet[2503]: E0123 01:05:51.346470    2503 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 23 01:05:51.347188 kubelet[2503]: I0123 01:05:51.346859    2503 factory.go:223] Registration of the containerd container factory successfully
Jan 23 01:05:51.385084 kubelet[2503]: I0123 01:05:51.385007    2503 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Jan 23 01:05:51.387442 kubelet[2503]: I0123 01:05:51.387392    2503 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
protocol="IPv6" Jan 23 01:05:51.387442 kubelet[2503]: I0123 01:05:51.387434 2503 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 23 01:05:51.387442 kubelet[2503]: I0123 01:05:51.387453 2503 kubelet.go:2427] "Starting kubelet main sync loop" Jan 23 01:05:51.387652 kubelet[2503]: E0123 01:05:51.387491 2503 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 01:05:51.388628 kubelet[2503]: E0123 01:05:51.388591 2503 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.5.114:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 23 01:05:51.389114 kubelet[2503]: I0123 01:05:51.389094 2503 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 01:05:51.389203 kubelet[2503]: I0123 01:05:51.389190 2503 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 01:05:51.389295 kubelet[2503]: I0123 01:05:51.389286 2503 state_mem.go:36] "Initialized new in-memory state store" Jan 23 01:05:51.395629 kubelet[2503]: I0123 01:05:51.395607 2503 policy_none.go:49] "None policy: Start" Jan 23 01:05:51.395781 kubelet[2503]: I0123 01:05:51.395765 2503 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 23 01:05:51.395863 kubelet[2503]: I0123 01:05:51.395850 2503 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 23 01:05:51.401300 kubelet[2503]: I0123 01:05:51.400362 2503 policy_none.go:47] "Start" Jan 23 01:05:51.406548 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 23 01:05:51.418314 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 23 01:05:51.423204 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 23 01:05:51.432494 kubelet[2503]: E0123 01:05:51.432449 2503 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 01:05:51.432679 kubelet[2503]: I0123 01:05:51.432652 2503 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 01:05:51.432751 kubelet[2503]: I0123 01:05:51.432667 2503 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 01:05:51.433778 kubelet[2503]: I0123 01:05:51.433416 2503 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 01:05:51.435905 kubelet[2503]: E0123 01:05:51.435881 2503 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 23 01:05:51.435993 kubelet[2503]: E0123 01:05:51.435915 2503 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-n-615049e46b\" not found" Jan 23 01:05:51.693236 kubelet[2503]: I0123 01:05:51.535170 2503 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.693236 kubelet[2503]: E0123 01:05:51.535580 2503 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.114:6443/api/v1/nodes\": dial tcp 10.0.5.114:6443: connect: connection refused" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.693236 kubelet[2503]: E0123 01:05:51.546152 2503 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.114:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-615049e46b?timeout=10s\": dial tcp 10.0.5.114:6443: connect: connection refused" interval="400ms" Jan 23 01:05:51.693236 kubelet[2503]: I0123 01:05:51.645880 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.693236 kubelet[2503]: I0123 01:05:51.645967 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.693236 kubelet[2503]: I0123 01:05:51.646014 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.694195 kubelet[2503]: I0123 01:05:51.646052 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.694195 kubelet[2503]: I0123 01:05:51.646112 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:51.714126 systemd[1]: Created slice kubepods-burstable-pod5cc9b1ec51b022dcf1e2ac5f7e1a2376.slice - libcontainer container kubepods-burstable-pod5cc9b1ec51b022dcf1e2ac5f7e1a2376.slice. 
Jan 23 01:05:51.733356 kubelet[2503]: E0123 01:05:51.733013    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:51.738838 kubelet[2503]: I0123 01:05:51.738808    2503 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:51.739592 kubelet[2503]: E0123 01:05:51.739423    2503 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.114:6443/api/v1/nodes\": dial tcp 10.0.5.114:6443: connect: connection refused" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:51.747041 kubelet[2503]: I0123 01:05:51.747003    2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60fec7243a1c75d328bc4b48ffe4f935-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-615049e46b\" (UID: \"60fec7243a1c75d328bc4b48ffe4f935\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:51.947732 kubelet[2503]: E0123 01:05:51.947491    2503 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.114:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-615049e46b?timeout=10s\": dial tcp 10.0.5.114:6443: connect: connection refused" interval="800ms"
Jan 23 01:05:52.143453 kubelet[2503]: I0123 01:05:52.143376    2503 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:52.144437 kubelet[2503]: E0123 01:05:52.144380    2503 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.114:6443/api/v1/nodes\": dial tcp 10.0.5.114:6443: connect: connection refused" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:52.459056 kubelet[2503]: E0123 01:05:52.458775    2503 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.5.114:6443/api/v1/namespaces/default/events\": dial tcp 10.0.5.114:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-n-615049e46b.188d36ae67bad740  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-n-615049e46b,UID:ci-4459-2-2-n-615049e46b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-615049e46b,},FirstTimestamp:2026-01-23 01:05:51.318980416 +0000 UTC m=+0.736847540,LastTimestamp:2026-01-23 01:05:51.318980416 +0000 UTC m=+0.736847540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-615049e46b,}"
Jan 23 01:05:52.642709 kubelet[2503]: E0123 01:05:52.642630    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.5.114:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-615049e46b&limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Jan 23 01:05:52.664661 kubelet[2503]: E0123 01:05:52.664590    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.5.114:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 23 01:05:52.825891 kubelet[2503]: E0123 01:05:52.722448 2503 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.5.114:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 23 01:05:52.825891 kubelet[2503]: E0123 01:05:52.748485 2503 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.5.114:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-615049e46b?timeout=10s\": dial tcp 10.0.5.114:6443: connect: connection refused" interval="1.6s" Jan 23 01:05:52.832345 containerd[1649]: time="2026-01-23T01:05:52.831299758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-615049e46b,Uid:5cc9b1ec51b022dcf1e2ac5f7e1a2376,Namespace:kube-system,Attempt:0,}" Jan 23 01:05:52.851463 systemd[1]: Created slice kubepods-burstable-pod60fec7243a1c75d328bc4b48ffe4f935.slice - libcontainer container kubepods-burstable-pod60fec7243a1c75d328bc4b48ffe4f935.slice. Jan 23 01:05:52.855569 kubelet[2503]: I0123 01:05:52.854820 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:52.855569 kubelet[2503]: I0123 01:05:52.855339 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:52.855905 kubelet[2503]: I0123 01:05:52.855848 2503 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:52.870778 kubelet[2503]: E0123 01:05:52.869919 2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:52.876426 containerd[1649]: time="2026-01-23T01:05:52.876362110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-615049e46b,Uid:60fec7243a1c75d328bc4b48ffe4f935,Namespace:kube-system,Attempt:0,}" Jan 23 01:05:52.880235 systemd[1]: Created slice kubepods-burstable-pod9308210fbd41571fa9ff2c37ae11c3e8.slice - libcontainer container kubepods-burstable-pod9308210fbd41571fa9ff2c37ae11c3e8.slice. 
Jan 23 01:05:52.886271 kubelet[2503]: E0123 01:05:52.886229    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:52.947713 kubelet[2503]: I0123 01:05:52.947659    2503 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:52.948578 kubelet[2503]: E0123 01:05:52.948468    2503 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.5.114:6443/api/v1/nodes\": dial tcp 10.0.5.114:6443: connect: connection refused" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:52.954021 kubelet[2503]: E0123 01:05:52.953975    2503 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.5.114:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Jan 23 01:05:53.192955 containerd[1649]: time="2026-01-23T01:05:53.192871062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-615049e46b,Uid:9308210fbd41571fa9ff2c37ae11c3e8,Namespace:kube-system,Attempt:0,}"
Jan 23 01:05:53.336236 kubelet[2503]: E0123 01:05:53.336137    2503 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.5.114:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.5.114:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Jan 23 01:05:53.906737 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3580460819.mount: Deactivated successfully.
Jan 23 01:05:53.915958 containerd[1649]: time="2026-01-23T01:05:53.915891784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 23 01:05:53.920559 containerd[1649]: time="2026-01-23T01:05:53.919777821Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 23 01:05:53.923204 containerd[1649]: time="2026-01-23T01:05:53.923151243Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158"
Jan 23 01:05:53.924506 containerd[1649]: time="2026-01-23T01:05:53.924463864Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Jan 23 01:05:53.927119 containerd[1649]: time="2026-01-23T01:05:53.927081169Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 23 01:05:53.928918 containerd[1649]: time="2026-01-23T01:05:53.928877750Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 23 01:05:53.930804 containerd[1649]: time="2026-01-23T01:05:53.930466675Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Jan 23 01:05:53.935083 containerd[1649]: time="2026-01-23T01:05:53.933888443Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 23 01:05:53.935083 containerd[1649]: time="2026-01-23T01:05:53.934476730Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.096368215s"
Jan 23 01:05:53.936450 containerd[1649]: time="2026-01-23T01:05:53.936398977Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 740.517988ms"
Jan 23 01:05:53.936970 containerd[1649]: time="2026-01-23T01:05:53.936886584Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.057168565s"
Jan 23 01:05:53.997292 containerd[1649]: time="2026-01-23T01:05:53.997242235Z" level=info msg="connecting to shim c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1" address="unix:///run/containerd/s/2484a299f42437d70d09f69af8414a9bdfc61bbe51ddde82ef6e3dbb06eaeb2b" namespace=k8s.io protocol=ttrpc version=3
Jan 23 01:05:54.000417 containerd[1649]: time="2026-01-23T01:05:54.000320737Z" level=info msg="connecting to shim f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f" address="unix:///run/containerd/s/e332426a1f55a5c69c5289e089457d1f8eaf204426184d8fb774a06c34be154a" namespace=k8s.io protocol=ttrpc version=3
Jan 23 01:05:54.010286 containerd[1649]: time="2026-01-23T01:05:54.010237504Z" level=info msg="connecting to shim 063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3" address="unix:///run/containerd/s/1cff102426f9383e20572203ff4cdc8c69269f3edd5955a4c11bb9dbd611c080" namespace=k8s.io protocol=ttrpc version=3
Jan 23 01:05:54.036745 systemd[1]: Started cri-containerd-c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1.scope - libcontainer container c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1.
Jan 23 01:05:54.043979 systemd[1]: Started cri-containerd-f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f.scope - libcontainer container f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f.
Jan 23 01:05:54.052165 systemd[1]: Started cri-containerd-063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3.scope - libcontainer container 063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3.
Jan 23 01:05:54.110066 containerd[1649]: time="2026-01-23T01:05:54.110010720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-615049e46b,Uid:9308210fbd41571fa9ff2c37ae11c3e8,Namespace:kube-system,Attempt:0,} returns sandbox id \"f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f\""
Jan 23 01:05:54.115339 containerd[1649]: time="2026-01-23T01:05:54.115290954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-615049e46b,Uid:5cc9b1ec51b022dcf1e2ac5f7e1a2376,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1\""
Jan 23 01:05:54.119470 containerd[1649]: time="2026-01-23T01:05:54.119400316Z" level=info msg="CreateContainer within sandbox \"f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jan 23 01:05:54.122608 containerd[1649]: time="2026-01-23T01:05:54.122581903Z" level=info msg="CreateContainer within sandbox \"c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jan 23 01:05:54.142926 containerd[1649]: time="2026-01-23T01:05:54.142893793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-615049e46b,Uid:60fec7243a1c75d328bc4b48ffe4f935,Namespace:kube-system,Attempt:0,} returns sandbox id \"063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3\""
Jan 23 01:05:54.146752 containerd[1649]: time="2026-01-23T01:05:54.146711200Z" level=info msg="Container 86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8: CDI devices from CRI Config.CDIDevices: []"
Jan 23 01:05:54.148413 containerd[1649]: time="2026-01-23T01:05:54.148390019Z" level=info msg="CreateContainer within sandbox \"063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jan 23 01:05:54.151387 containerd[1649]: time="2026-01-23T01:05:54.151291199Z" level=info msg="Container 1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af: CDI devices from CRI Config.CDIDevices: []"
Jan 23 01:05:54.157748 containerd[1649]: time="2026-01-23T01:05:54.157678360Z" level=info msg="CreateContainer within sandbox \"f7cd0ac8404812ceb44eef2199e5a8bd4f512746fcafef6eb5d4b6094e37c16f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8\""
Jan 23 01:05:54.158530 containerd[1649]: time="2026-01-23T01:05:54.158496615Z" level=info msg="StartContainer for \"86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8\""
Jan 23 01:05:54.159335 containerd[1649]: time="2026-01-23T01:05:54.159311594Z" level=info msg="connecting to shim 86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8" address="unix:///run/containerd/s/e332426a1f55a5c69c5289e089457d1f8eaf204426184d8fb774a06c34be154a" protocol=ttrpc version=3
Jan 23 01:05:54.163913 containerd[1649]: time="2026-01-23T01:05:54.163883951Z" level=info msg="CreateContainer within sandbox \"c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af\""
Jan 23 01:05:54.164563 containerd[1649]: time="2026-01-23T01:05:54.164219014Z" level=info msg="StartContainer for \"1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af\""
Jan 23 01:05:54.165216 containerd[1649]: time="2026-01-23T01:05:54.165196453Z" level=info msg="connecting to shim 1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af" address="unix:///run/containerd/s/2484a299f42437d70d09f69af8414a9bdfc61bbe51ddde82ef6e3dbb06eaeb2b" protocol=ttrpc version=3
Jan 23 01:05:54.172024 containerd[1649]: time="2026-01-23T01:05:54.172005854Z" level=info msg="Container 7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c: CDI devices from CRI Config.CDIDevices: []"
Jan 23 01:05:54.187106 systemd[1]: Started cri-containerd-86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8.scope - libcontainer container 86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8.
Jan 23 01:05:54.189230 containerd[1649]: time="2026-01-23T01:05:54.188862701Z" level=info msg="CreateContainer within sandbox \"063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c\""
Jan 23 01:05:54.189713 containerd[1649]: time="2026-01-23T01:05:54.189648466Z" level=info msg="StartContainer for \"7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c\""
Jan 23 01:05:54.191532 systemd[1]: Started cri-containerd-1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af.scope - libcontainer container 1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af.
Jan 23 01:05:54.193531 containerd[1649]: time="2026-01-23T01:05:54.193111922Z" level=info msg="connecting to shim 7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c" address="unix:///run/containerd/s/1cff102426f9383e20572203ff4cdc8c69269f3edd5955a4c11bb9dbd611c080" protocol=ttrpc version=3
Jan 23 01:05:54.225691 systemd[1]: Started cri-containerd-7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c.scope - libcontainer container 7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c.
Jan 23 01:05:54.270813 containerd[1649]: time="2026-01-23T01:05:54.270771131Z" level=info msg="StartContainer for \"1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af\" returns successfully"
Jan 23 01:05:54.278847 containerd[1649]: time="2026-01-23T01:05:54.277675312Z" level=info msg="StartContainer for \"86d3c0ebb6528313d551caf232237fca219c95df9a1acd892e0fdf0d9c4624d8\" returns successfully"
Jan 23 01:05:54.341883 containerd[1649]: time="2026-01-23T01:05:54.341848560Z" level=info msg="StartContainer for \"7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c\" returns successfully"
Jan 23 01:05:54.404111 kubelet[2503]: E0123 01:05:54.404047    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:54.406271 kubelet[2503]: E0123 01:05:54.406249    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:54.408973 kubelet[2503]: E0123 01:05:54.408455    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:54.550731 kubelet[2503]: I0123 01:05:54.550699    2503 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:55.410171 kubelet[2503]: E0123 01:05:55.410139    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:55.410831 kubelet[2503]: E0123 01:05:55.410816    2503 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:55.997815 kubelet[2503]: E0123 01:05:55.997772    2503 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-n-615049e46b\" not found" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.103097 kubelet[2503]: I0123 01:05:56.103016    2503 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.145839 kubelet[2503]: I0123 01:05:56.145801    2503 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.152527 kubelet[2503]: E0123 01:05:56.152488    2503 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.152527 kubelet[2503]: I0123 01:05:56.152519    2503 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.153856 kubelet[2503]: E0123 01:05:56.153818    2503 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-n-615049e46b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.153856 kubelet[2503]: I0123 01:05:56.153835    2503 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.155026 kubelet[2503]: E0123 01:05:56.155010    2503 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-615049e46b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.310705 kubelet[2503]: I0123 01:05:56.310032    2503 apiserver.go:52] "Watching apiserver"
Jan 23 01:05:56.344647 kubelet[2503]: I0123 01:05:56.344590    2503 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 23 01:05:56.411398 kubelet[2503]: I0123 01:05:56.411322    2503 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:56.415683 kubelet[2503]: E0123 01:05:56.415629    2503 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-n-615049e46b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:57.712759 kubelet[2503]: I0123 01:05:57.712727    2503 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b"
Jan 23 01:05:58.298463 systemd[1]: Reload requested from client PID 2788 ('systemctl') (unit session-9.scope)...
Jan 23 01:05:58.298997 systemd[1]: Reloading...
Jan 23 01:05:58.447540 zram_generator::config[2831]: No configuration found.
Jan 23 01:05:58.653073 systemd[1]: Reloading finished in 353 ms.
Jan 23 01:05:58.683264 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 01:05:58.700759 systemd[1]: kubelet.service: Deactivated successfully.
Jan 23 01:05:58.701132 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:58.701191 systemd[1]: kubelet.service: Consumed 1.004s CPU time, 124.8M memory peak.
Jan 23 01:05:58.704955 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 01:05:58.916127 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 01:05:58.922874 (kubelet)[2882]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 23 01:05:58.961901 kubelet[2882]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Jan 23 01:05:58.961901 kubelet[2882]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 23 01:05:58.962537 kubelet[2882]: I0123 01:05:58.961872    2882 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 23 01:05:58.969589 kubelet[2882]: I0123 01:05:58.969012    2882 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Jan 23 01:05:58.969589 kubelet[2882]: I0123 01:05:58.969030    2882 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 23 01:05:58.969589 kubelet[2882]: I0123 01:05:58.969054    2882 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Jan 23 01:05:58.969589 kubelet[2882]: I0123 01:05:58.969061    2882 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Jan 23 01:05:58.969589 kubelet[2882]: I0123 01:05:58.969258    2882 server.go:956] "Client rotation is on, will bootstrap in background"
Jan 23 01:05:58.970875 kubelet[2882]: I0123 01:05:58.970857    2882 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Jan 23 01:05:58.973473 kubelet[2882]: I0123 01:05:58.973457    2882 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 23 01:05:58.976492 kubelet[2882]: I0123 01:05:58.976474    2882 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 23 01:05:58.979683 kubelet[2882]: I0123 01:05:58.979663    2882 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Jan 23 01:05:58.979861 kubelet[2882]: I0123 01:05:58.979822    2882 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 23 01:05:58.980025 kubelet[2882]: I0123 01:05:58.979843    2882 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-n-615049e46b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 23 01:05:58.980134 kubelet[2882]: I0123 01:05:58.980031    2882 topology_manager.go:138] "Creating topology manager with none policy"
Jan 23 01:05:58.980134 kubelet[2882]: I0123 01:05:58.980041    2882 container_manager_linux.go:306] "Creating device plugin manager"
Jan 23 01:05:58.980134 kubelet[2882]: I0123 01:05:58.980062    2882 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Jan 23 01:05:58.981181 kubelet[2882]: I0123 01:05:58.981166    2882 state_mem.go:36] "Initialized new in-memory state store"
Jan 23 01:05:58.981309 kubelet[2882]: I0123 01:05:58.981299    2882 kubelet.go:475] "Attempting to sync node with API server"
Jan 23 01:05:58.981346 kubelet[2882]: I0123 01:05:58.981314    2882 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 23 01:05:58.981346 kubelet[2882]: I0123 01:05:58.981339    2882 kubelet.go:387] "Adding apiserver pod source"
pod source" Jan 23 01:05:58.981399 kubelet[2882]: I0123 01:05:58.981363 2882 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 01:05:58.985657 kubelet[2882]: I0123 01:05:58.985558 2882 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Jan 23 01:05:58.986903 kubelet[2882]: I0123 01:05:58.986363 2882 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 01:05:58.987078 kubelet[2882]: I0123 01:05:58.987069 2882 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 23 01:05:58.991526 kubelet[2882]: I0123 01:05:58.989606 2882 server.go:1262] "Started kubelet" Jan 23 01:05:58.991526 kubelet[2882]: I0123 01:05:58.989852 2882 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 01:05:58.991526 kubelet[2882]: I0123 01:05:58.990445 2882 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 01:05:58.991526 kubelet[2882]: I0123 01:05:58.990568 2882 server.go:310] "Adding debug handlers to kubelet server" Jan 23 01:05:58.993764 kubelet[2882]: I0123 01:05:58.993191 2882 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 01:05:58.993764 kubelet[2882]: I0123 01:05:58.993227 2882 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 23 01:05:58.993764 kubelet[2882]: I0123 01:05:58.993338 2882 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 01:05:59.005065 kubelet[2882]: I0123 01:05:59.005045 2882 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 01:05:59.005534 kubelet[2882]: I0123 01:05:59.005440 2882 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 23 01:05:59.012486 kubelet[2882]: I0123 01:05:59.005663 2882 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 23 01:05:59.012486 kubelet[2882]: E0123 01:05:59.005785 2882 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-615049e46b\" not found" Jan 23 01:05:59.012624 kubelet[2882]: I0123 01:05:59.012609 2882 reconciler.go:29] "Reconciler: start to sync state" Jan 23 01:05:59.016638 kubelet[2882]: I0123 01:05:59.016615 2882 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 01:05:59.019775 kubelet[2882]: I0123 01:05:59.019724 2882 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 23 01:05:59.020642 kubelet[2882]: I0123 01:05:59.020623 2882 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 23 01:05:59.020642 kubelet[2882]: I0123 01:05:59.020639 2882 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 23 01:05:59.020722 kubelet[2882]: I0123 01:05:59.020656 2882 kubelet.go:2427] "Starting kubelet main sync loop" Jan 23 01:05:59.020722 kubelet[2882]: E0123 01:05:59.020688 2882 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 01:05:59.028207 kubelet[2882]: I0123 01:05:59.027777 2882 factory.go:223] Registration of the containerd container factory successfully Jan 23 01:05:59.028207 kubelet[2882]: I0123 01:05:59.027789 2882 factory.go:223] Registration of the systemd container factory successfully Jan 23 01:05:59.124774 kubelet[2882]: E0123 01:05:59.124651 2882 kubelet.go:2451] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 23 01:05:59.137109 kubelet[2882]: I0123 01:05:59.136916 2882 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 01:05:59.137109 kubelet[2882]: I0123 01:05:59.136929 2882 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 01:05:59.137109 kubelet[2882]: I0123 01:05:59.136944 2882 state_mem.go:36] "Initialized new in-memory state store" Jan 23 01:05:59.137568 kubelet[2882]: I0123 01:05:59.137549 2882 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 01:05:59.137614 kubelet[2882]: I0123 01:05:59.137569 2882 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 01:05:59.137614 kubelet[2882]: I0123 01:05:59.137589 2882 policy_none.go:49] "None policy: Start" Jan 23 01:05:59.137614 kubelet[2882]: I0123 01:05:59.137599 2882 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 23 01:05:59.137614 kubelet[2882]: I0123 01:05:59.137610 2882 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 23 01:05:59.137716 kubelet[2882]: I0123 01:05:59.137702 2882 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 23 01:05:59.137716 kubelet[2882]: I0123 01:05:59.137709 2882 policy_none.go:47] "Start" Jan 23 01:05:59.142416 kubelet[2882]: E0123 01:05:59.142400 2882 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 01:05:59.142558 kubelet[2882]: I0123 01:05:59.142546 2882 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 01:05:59.142600 kubelet[2882]: I0123 01:05:59.142559 2882 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 01:05:59.143339 kubelet[2882]: I0123 01:05:59.143272 2882 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 01:05:59.144308 kubelet[2882]: E0123 01:05:59.144294 2882 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 23 01:05:59.252984 kubelet[2882]: I0123 01:05:59.252804 2882 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.272811 kubelet[2882]: I0123 01:05:59.272682 2882 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.274324 kubelet[2882]: I0123 01:05:59.274209 2882 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.326876 kubelet[2882]: I0123 01:05:59.326291 2882 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.326876 kubelet[2882]: I0123 01:05:59.326831 2882 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.328194 kubelet[2882]: I0123 01:05:59.327597 2882 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.339146 kubelet[2882]: E0123 01:05:59.338576 2882 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-615049e46b\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.426886 kubelet[2882]: I0123 01:05:59.426681 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.426886 kubelet[2882]: I0123 01:05:59.426882 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.427157 kubelet[2882]: I0123 01:05:59.426987 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.427157 kubelet[2882]: I0123 01:05:59.427076 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60fec7243a1c75d328bc4b48ffe4f935-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-615049e46b\" (UID: \"60fec7243a1c75d328bc4b48ffe4f935\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.427244 kubelet[2882]: I0123 01:05:59.427160 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9308210fbd41571fa9ff2c37ae11c3e8-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-615049e46b\" (UID: \"9308210fbd41571fa9ff2c37ae11c3e8\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.427365 kubelet[2882]: I0123 01:05:59.427332 2882 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.428816 kubelet[2882]: I0123 01:05:59.427537 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.428816 kubelet[2882]: I0123 01:05:59.427786 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.428816 kubelet[2882]: I0123 01:05:59.427927 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cc9b1ec51b022dcf1e2ac5f7e1a2376-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-615049e46b\" (UID: \"5cc9b1ec51b022dcf1e2ac5f7e1a2376\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" Jan 23 01:05:59.988787 kubelet[2882]: I0123 01:05:59.988687 2882 apiserver.go:52] "Watching apiserver" Jan 23 01:06:00.013563 kubelet[2882]: I0123 01:06:00.013440 2882 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 01:06:00.092625 kubelet[2882]: I0123 01:06:00.092484 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-615049e46b" podStartSLOduration=1.092461595 podStartE2EDuration="1.092461595s" podCreationTimestamp="2026-01-23 01:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:00.091250952 +0000 UTC m=+1.163876535" watchObservedRunningTime="2026-01-23 01:06:00.092461595 +0000 UTC m=+1.165087187" Jan 23 01:06:00.112595 kubelet[2882]: I0123 01:06:00.112528 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-n-615049e46b" podStartSLOduration=1.112500001 podStartE2EDuration="1.112500001s" podCreationTimestamp="2026-01-23 01:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:00.102503792 +0000 UTC m=+1.175129371" watchObservedRunningTime="2026-01-23 01:06:00.112500001 +0000 UTC m=+1.185125584" Jan 23 01:06:00.122967 kubelet[2882]: I0123 01:06:00.122822 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-n-615049e46b" podStartSLOduration=3.122809724 podStartE2EDuration="3.122809724s" podCreationTimestamp="2026-01-23 01:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:00.111811945 +0000 UTC 
m=+1.184437549" watchObservedRunningTime="2026-01-23 01:06:00.122809724 +0000 UTC m=+1.195435298" Jan 23 01:06:03.790441 kubelet[2882]: I0123 01:06:03.790387 2882 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 01:06:03.791507 containerd[1649]: time="2026-01-23T01:06:03.791133614Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 23 01:06:03.792335 kubelet[2882]: I0123 01:06:03.791943 2882 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 01:06:04.659605 systemd[1]: Created slice kubepods-besteffort-pod23816369_b1d8_4f11_8fd3_02c3a085eeea.slice - libcontainer container kubepods-besteffort-pod23816369_b1d8_4f11_8fd3_02c3a085eeea.slice. Jan 23 01:06:04.665422 kubelet[2882]: I0123 01:06:04.664101 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/23816369-b1d8-4f11-8fd3-02c3a085eeea-kube-proxy\") pod \"kube-proxy-pgntr\" (UID: \"23816369-b1d8-4f11-8fd3-02c3a085eeea\") " pod="kube-system/kube-proxy-pgntr" Jan 23 01:06:04.665422 kubelet[2882]: I0123 01:06:04.664150 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpjv\" (UniqueName: \"kubernetes.io/projected/23816369-b1d8-4f11-8fd3-02c3a085eeea-kube-api-access-7bpjv\") pod \"kube-proxy-pgntr\" (UID: \"23816369-b1d8-4f11-8fd3-02c3a085eeea\") " pod="kube-system/kube-proxy-pgntr" Jan 23 01:06:04.665422 kubelet[2882]: I0123 01:06:04.664200 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/23816369-b1d8-4f11-8fd3-02c3a085eeea-xtables-lock\") pod \"kube-proxy-pgntr\" (UID: \"23816369-b1d8-4f11-8fd3-02c3a085eeea\") " pod="kube-system/kube-proxy-pgntr" Jan 23 01:06:04.665422 kubelet[2882]: I0123 01:06:04.664231 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23816369-b1d8-4f11-8fd3-02c3a085eeea-lib-modules\") pod \"kube-proxy-pgntr\" (UID: \"23816369-b1d8-4f11-8fd3-02c3a085eeea\") " pod="kube-system/kube-proxy-pgntr" Jan 23 01:06:04.977315 containerd[1649]: time="2026-01-23T01:06:04.977079576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pgntr,Uid:23816369-b1d8-4f11-8fd3-02c3a085eeea,Namespace:kube-system,Attempt:0,}" Jan 23 01:06:05.009228 containerd[1649]: time="2026-01-23T01:06:05.008984355Z" level=info msg="connecting to shim 8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2" address="unix:///run/containerd/s/0de116bc4f47fb1baa60094cebc7cd91196a5805a945992e2547fabf7447e710" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:05.038893 systemd[1]: Created slice kubepods-besteffort-pod29366d18_648b_4c03_8572_11254e538738.slice - libcontainer container kubepods-besteffort-pod29366d18_648b_4c03_8572_11254e538738.slice. Jan 23 01:06:05.059696 systemd[1]: Started cri-containerd-8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2.scope - libcontainer container 8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2. 
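
The generic runtime pattern is visible in the entries above: the kubelet drives containerd, containerd spawns a per-sandbox shim (the "connecting to shim ... protocol=ttrpc" lines), and systemd tracks each container as a cri-containerd-<id>.scope unit. All CRI-managed objects live in containerd's "k8s.io" namespace, as the namespace=k8s.io field shows. Below is a minimal sketch of inspecting that state with containerd's Go client; the socket path is containerd's conventional default and is an assumption, not something read from this log.

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Conventional containerd socket path; an assumption, not taken from this log.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed sandboxes and containers live in the "k8s.io" namespace,
	// matching namespace=k8s.io in the shim lines above.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		fmt.Println(c.ID()) // e.g. 8988b6b68f4b... for the kube-proxy sandbox above
	}
}
```
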
Jan 23 01:06:05.067711 kubelet[2882]: I0123 01:06:05.067650 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726lb\" (UniqueName: \"kubernetes.io/projected/29366d18-648b-4c03-8572-11254e538738-kube-api-access-726lb\") pod \"tigera-operator-65cdcdfd6d-k9zm2\" (UID: \"29366d18-648b-4c03-8572-11254e538738\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-k9zm2" Jan 23 01:06:05.068291 kubelet[2882]: I0123 01:06:05.068073 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/29366d18-648b-4c03-8572-11254e538738-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-k9zm2\" (UID: \"29366d18-648b-4c03-8572-11254e538738\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-k9zm2" Jan 23 01:06:05.086320 containerd[1649]: time="2026-01-23T01:06:05.086240196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pgntr,Uid:23816369-b1d8-4f11-8fd3-02c3a085eeea,Namespace:kube-system,Attempt:0,} returns sandbox id \"8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2\"" Jan 23 01:06:05.092422 containerd[1649]: time="2026-01-23T01:06:05.092330024Z" level=info msg="CreateContainer within sandbox \"8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 01:06:05.109582 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3436484822.mount: Deactivated successfully. Jan 23 01:06:05.111541 containerd[1649]: time="2026-01-23T01:06:05.110981239Z" level=info msg="Container f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:05.124925 containerd[1649]: time="2026-01-23T01:06:05.124871937Z" level=info msg="CreateContainer within sandbox \"8988b6b68f4b0743807d17e901da7458431a082765b1d87ef1c054f5b86070d2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b\"" Jan 23 01:06:05.127534 containerd[1649]: time="2026-01-23T01:06:05.125697973Z" level=info msg="StartContainer for \"f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b\"" Jan 23 01:06:05.127534 containerd[1649]: time="2026-01-23T01:06:05.126794686Z" level=info msg="connecting to shim f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b" address="unix:///run/containerd/s/0de116bc4f47fb1baa60094cebc7cd91196a5805a945992e2547fabf7447e710" protocol=ttrpc version=3 Jan 23 01:06:05.147667 systemd[1]: Started cri-containerd-f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b.scope - libcontainer container f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b. 
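
The three containerd messages around the kube-proxy container ("RunPodSandbox ... returns sandbox id", "CreateContainer within sandbox", "StartContainer") are the standard CRI call sequence. The following is a hedged sketch of the same sequence against the CRI RuntimeService; only the pod name and UID come from the log, while the image reference and the remaining metadata are illustrative placeholders.

```go
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx := context.Background()
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// 1. RunPodSandbox: pod name/UID as logged for kube-proxy-pgntr.
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "kube-proxy-pgntr",
			Namespace: "kube-system",
			Uid:       "23816369-b1d8-4f11-8fd3-02c3a085eeea",
		},
	}
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		log.Fatal(err)
	}

	// 2. CreateContainer inside that sandbox (image tag is a placeholder).
	ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy"},
			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.34.0"},
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		log.Fatal(err)
	}

	// 3. StartContainer, after which the log reports "returns successfully".
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{
		ContainerId: ctr.ContainerId,
	}); err != nil {
		log.Fatal(err)
	}
}
```
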
Jan 23 01:06:05.218539 containerd[1649]: time="2026-01-23T01:06:05.217911717Z" level=info msg="StartContainer for \"f68a9d3e6bc5ccd88aa20a518760de9d2abb4ac6505eec84c42a7535af384f8b\" returns successfully" Jan 23 01:06:05.346581 containerd[1649]: time="2026-01-23T01:06:05.346482906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-k9zm2,Uid:29366d18-648b-4c03-8572-11254e538738,Namespace:tigera-operator,Attempt:0,}" Jan 23 01:06:05.373139 containerd[1649]: time="2026-01-23T01:06:05.373069525Z" level=info msg="connecting to shim 1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b" address="unix:///run/containerd/s/e67d63aca0acd5730f46303c3a803929322950099b5ba83205749c67d3c853b3" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:05.404774 systemd[1]: Started cri-containerd-1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b.scope - libcontainer container 1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b. Jan 23 01:06:05.476591 containerd[1649]: time="2026-01-23T01:06:05.476507471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-k9zm2,Uid:29366d18-648b-4c03-8572-11254e538738,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b\"" Jan 23 01:06:05.479335 containerd[1649]: time="2026-01-23T01:06:05.479296205Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 01:06:07.365491 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount878472707.mount: Deactivated successfully. Jan 23 01:06:07.791400 containerd[1649]: time="2026-01-23T01:06:07.791303874Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:07.792683 containerd[1649]: time="2026-01-23T01:06:07.792540284Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Jan 23 01:06:07.794090 containerd[1649]: time="2026-01-23T01:06:07.794071248Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:07.796927 containerd[1649]: time="2026-01-23T01:06:07.796885477Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:07.797845 containerd[1649]: time="2026-01-23T01:06:07.797634071Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.318209715s" Jan 23 01:06:07.797845 containerd[1649]: time="2026-01-23T01:06:07.797661434Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 23 01:06:07.802372 containerd[1649]: time="2026-01-23T01:06:07.801840693Z" level=info msg="CreateContainer within sandbox \"1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 01:06:07.814273 containerd[1649]: time="2026-01-23T01:06:07.814249981Z" 
level=info msg="Container ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:07.823048 containerd[1649]: time="2026-01-23T01:06:07.823017136Z" level=info msg="CreateContainer within sandbox \"1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31\"" Jan 23 01:06:07.823538 containerd[1649]: time="2026-01-23T01:06:07.823477869Z" level=info msg="StartContainer for \"ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31\"" Jan 23 01:06:07.825232 containerd[1649]: time="2026-01-23T01:06:07.825211114Z" level=info msg="connecting to shim ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31" address="unix:///run/containerd/s/e67d63aca0acd5730f46303c3a803929322950099b5ba83205749c67d3c853b3" protocol=ttrpc version=3 Jan 23 01:06:07.843656 systemd[1]: Started cri-containerd-ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31.scope - libcontainer container ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31. Jan 23 01:06:07.875779 containerd[1649]: time="2026-01-23T01:06:07.875738278Z" level=info msg="StartContainer for \"ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31\" returns successfully" Jan 23 01:06:08.096868 kubelet[2882]: I0123 01:06:08.096780 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pgntr" podStartSLOduration=4.096762502 podStartE2EDuration="4.096762502s" podCreationTimestamp="2026-01-23 01:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:06.087222864 +0000 UTC m=+7.159848447" watchObservedRunningTime="2026-01-23 01:06:08.096762502 +0000 UTC m=+9.169388096" Jan 23 01:06:11.099828 kubelet[2882]: I0123 01:06:11.099340 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-k9zm2" podStartSLOduration=4.778989488 podStartE2EDuration="7.099322504s" podCreationTimestamp="2026-01-23 01:06:04 +0000 UTC" firstStartedPulling="2026-01-23 01:06:05.478105358 +0000 UTC m=+6.550730914" lastFinishedPulling="2026-01-23 01:06:07.798438374 +0000 UTC m=+8.871063930" observedRunningTime="2026-01-23 01:06:08.0973578 +0000 UTC m=+9.169983402" watchObservedRunningTime="2026-01-23 01:06:11.099322504 +0000 UTC m=+12.171948086" Jan 23 01:06:13.602864 sudo[1917]: pam_unix(sudo:session): session closed for user root Jan 23 01:06:13.701362 sshd[1916]: Connection closed by 20.161.92.111 port 39248 Jan 23 01:06:13.704719 sshd-session[1913]: pam_unix(sshd:session): session closed for user core Jan 23 01:06:13.707953 systemd[1]: sshd@8-10.0.5.114:22-20.161.92.111:39248.service: Deactivated successfully. Jan 23 01:06:13.713309 systemd[1]: session-9.scope: Deactivated successfully. Jan 23 01:06:13.713994 systemd[1]: session-9.scope: Consumed 5.370s CPU time, 229M memory peak. Jan 23 01:06:13.716170 systemd-logind[1625]: Session 9 logged out. Waiting for processes to exit. Jan 23 01:06:13.719988 systemd-logind[1625]: Removed session 9. Jan 23 01:06:17.939175 systemd[1]: Created slice kubepods-besteffort-pod48acdb87_70df_4d3a_a335_372f69f3372e.slice - libcontainer container kubepods-besteffort-pod48acdb87_70df_4d3a_a335_372f69f3372e.slice. 
Jan 23 01:06:18.049036 kubelet[2882]: I0123 01:06:18.048966 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/48acdb87-70df-4d3a-a335-372f69f3372e-typha-certs\") pod \"calico-typha-b666cc779-6djgh\" (UID: \"48acdb87-70df-4d3a-a335-372f69f3372e\") " pod="calico-system/calico-typha-b666cc779-6djgh" Jan 23 01:06:18.049036 kubelet[2882]: I0123 01:06:18.049019 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2g4\" (UniqueName: \"kubernetes.io/projected/48acdb87-70df-4d3a-a335-372f69f3372e-kube-api-access-zd2g4\") pod \"calico-typha-b666cc779-6djgh\" (UID: \"48acdb87-70df-4d3a-a335-372f69f3372e\") " pod="calico-system/calico-typha-b666cc779-6djgh" Jan 23 01:06:18.049036 kubelet[2882]: I0123 01:06:18.049047 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48acdb87-70df-4d3a-a335-372f69f3372e-tigera-ca-bundle\") pod \"calico-typha-b666cc779-6djgh\" (UID: \"48acdb87-70df-4d3a-a335-372f69f3372e\") " pod="calico-system/calico-typha-b666cc779-6djgh" Jan 23 01:06:18.178092 systemd[1]: Created slice kubepods-besteffort-podc653a68f_2222_446d_95a2_4c0f5a91fab6.slice - libcontainer container kubepods-besteffort-podc653a68f_2222_446d_95a2_4c0f5a91fab6.slice. Jan 23 01:06:18.247552 containerd[1649]: time="2026-01-23T01:06:18.247276721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b666cc779-6djgh,Uid:48acdb87-70df-4d3a-a335-372f69f3372e,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:18.251219 kubelet[2882]: I0123 01:06:18.251105 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-cni-net-dir\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251219 kubelet[2882]: I0123 01:06:18.251165 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-flexvol-driver-host\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251219 kubelet[2882]: I0123 01:06:18.251188 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-policysync\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251602 kubelet[2882]: I0123 01:06:18.251442 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-var-run-calico\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251602 kubelet[2882]: I0123 01:06:18.251478 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-cni-bin-dir\") pod \"calico-node-p8jwb\" (UID: 
\"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251602 kubelet[2882]: I0123 01:06:18.251496 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c653a68f-2222-446d-95a2-4c0f5a91fab6-node-certs\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251602 kubelet[2882]: I0123 01:06:18.251549 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-xtables-lock\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.251602 kubelet[2882]: I0123 01:06:18.251568 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxc8j\" (UniqueName: \"kubernetes.io/projected/c653a68f-2222-446d-95a2-4c0f5a91fab6-kube-api-access-vxc8j\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.252028 kubelet[2882]: I0123 01:06:18.251719 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-lib-modules\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.252028 kubelet[2882]: I0123 01:06:18.251741 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-cni-log-dir\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.252028 kubelet[2882]: I0123 01:06:18.251758 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c653a68f-2222-446d-95a2-4c0f5a91fab6-tigera-ca-bundle\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.252028 kubelet[2882]: I0123 01:06:18.251775 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c653a68f-2222-446d-95a2-4c0f5a91fab6-var-lib-calico\") pod \"calico-node-p8jwb\" (UID: \"c653a68f-2222-446d-95a2-4c0f5a91fab6\") " pod="calico-system/calico-node-p8jwb" Jan 23 01:06:18.280463 containerd[1649]: time="2026-01-23T01:06:18.280117899Z" level=info msg="connecting to shim f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd" address="unix:///run/containerd/s/21ffeeb0e0cac90bee38dcb4dff81e37c8e5271a9f58029b7af3bf3061b53789" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:18.313659 systemd[1]: Started cri-containerd-f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd.scope - libcontainer container f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd. 
Jan 23 01:06:18.354183 kubelet[2882]: E0123 01:06:18.353997 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:18.363108 kubelet[2882]: E0123 01:06:18.362589 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.363108 kubelet[2882]: W0123 01:06:18.362855 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.363108 kubelet[2882]: E0123 01:06:18.362887 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.363795 kubelet[2882]: E0123 01:06:18.363475 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.363795 kubelet[2882]: W0123 01:06:18.363485 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.363795 kubelet[2882]: E0123 01:06:18.363494 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.365288 kubelet[2882]: E0123 01:06:18.364549 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.365288 kubelet[2882]: W0123 01:06:18.364561 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.365288 kubelet[2882]: E0123 01:06:18.364572 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.365288 kubelet[2882]: E0123 01:06:18.364875 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.365288 kubelet[2882]: W0123 01:06:18.364883 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.365288 kubelet[2882]: E0123 01:06:18.364891 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 01:06:18.413284 containerd[1649]: time="2026-01-23T01:06:18.413246645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b666cc779-6djgh,Uid:48acdb87-70df-4d3a-a335-372f69f3372e,Namespace:calico-system,Attempt:0,} returns sandbox id \"f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd\"" Jan 23 01:06:18.415604 containerd[1649]: time="2026-01-23T01:06:18.414571628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Jan 23 01:06:18.454256 kubelet[2882]: I0123 01:06:18.454132 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bfc1d39e-fed0-4ad0-8a64-aa0c649c314e-registration-dir\") pod \"csi-node-driver-p7kwl\" (UID: \"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e\") " pod="calico-system/csi-node-driver-p7kwl"
Jan 23 01:06:18.454442 kubelet[2882]: I0123 01:06:18.454364 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfc1d39e-fed0-4ad0-8a64-aa0c649c314e-kubelet-dir\") pod \"csi-node-driver-p7kwl\" (UID: \"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e\") " pod="calico-system/csi-node-driver-p7kwl"
Jan 23 01:06:18.454570 kubelet[2882]: I0123 01:06:18.454544 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bfc1d39e-fed0-4ad0-8a64-aa0c649c314e-socket-dir\") pod \"csi-node-driver-p7kwl\" (UID: \"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e\") " pod="calico-system/csi-node-driver-p7kwl"
Jan 23 01:06:18.454830 kubelet[2882]: I0123 01:06:18.454724 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/bfc1d39e-fed0-4ad0-8a64-aa0c649c314e-varrun\") pod \"csi-node-driver-p7kwl\" (UID: \"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e\") " pod="calico-system/csi-node-driver-p7kwl"
Jan 23 01:06:18.456116 kubelet[2882]: I0123 01:06:18.456110 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxlz\" (UniqueName: \"kubernetes.io/projected/bfc1d39e-fed0-4ad0-8a64-aa0c649c314e-kube-api-access-9pxlz\") pod \"csi-node-driver-p7kwl\" (UID: \"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e\") " pod="calico-system/csi-node-driver-p7kwl"
Error: unexpected end of JSON input" Jan 23 01:06:18.457122 kubelet[2882]: E0123 01:06:18.457105 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.457150 kubelet[2882]: W0123 01:06:18.457124 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.457150 kubelet[2882]: E0123 01:06:18.457131 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.457693 kubelet[2882]: E0123 01:06:18.457679 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.457693 kubelet[2882]: W0123 01:06:18.457692 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.457771 kubelet[2882]: E0123 01:06:18.457700 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.458042 kubelet[2882]: E0123 01:06:18.458030 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.458042 kubelet[2882]: W0123 01:06:18.458040 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.458098 kubelet[2882]: E0123 01:06:18.458048 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.458248 kubelet[2882]: E0123 01:06:18.458238 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 01:06:18.458248 kubelet[2882]: W0123 01:06:18.458247 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 01:06:18.458297 kubelet[2882]: E0123 01:06:18.458255 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 01:06:18.485683 containerd[1649]: time="2026-01-23T01:06:18.485581142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-p8jwb,Uid:c653a68f-2222-446d-95a2-4c0f5a91fab6,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:18.512627 containerd[1649]: time="2026-01-23T01:06:18.512541844Z" level=info msg="connecting to shim e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42" address="unix:///run/containerd/s/41c0abe3e30297158553a0b66978e82dd0d2a7cdb7498e2e6c20c863e557cf3d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:18.541671 systemd[1]: Started cri-containerd-e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42.scope - libcontainer container e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42. 
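The FlexVolume triple above comes from the kubelet's plugin prober: on each probe event it execs every driver found under /opt/libexec/kubernetes/kubelet-plugins/volume/exec with the argument `init` and expects a JSON status on stdout. Calico's `nodeagent~uds/uds` binary has not been installed yet, so the exec fails, the captured output is empty, and unmarshalling an empty byte slice yields exactly the logged "unexpected end of JSON input". A minimal sketch of that caller side, with the status shape per the FlexVolume spec (illustrative, not the kubelet's actual code):

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the FlexVolume spec's response object
// ("status", optional "message", optional "capabilities").
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// probeDriver is the caller side of the handshake: exec the driver
// with "init" and decode the JSON it prints on stdout.
func probeDriver(path string) (*driverStatus, error) {
	out, err := exec.Command(path, "init").CombinedOutput()
	if err != nil {
		// With the binary absent, the exec itself fails: the
		// driver-call.go:149 case in the log.
		return nil, fmt.Errorf("driver call failed: %v, output: %q", err, out)
	}
	var st driverStatus
	// On empty output json.Unmarshal fails with exactly
	// "unexpected end of JSON input" (driver-call.go:262 in the log).
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output for command: init, output: %q, error: %w", out, err)
	}
	return &st, nil
}

func main() {
	if _, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"); err != nil {
		fmt.Println(err) // what the kubelet entries above are reporting
	}
}
```

The messages are noisy but benign at this stage; the flexvol-driver container started later in this log is what installs the missing binary.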
Jan 23 01:06:18.558668 kubelet[2882]: E0123 01:06:18.558641 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 23 01:06:18.559177 kubelet[2882]: W0123 01:06:18.559157 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 23 01:06:18.559219 kubelet[2882]: E0123 01:06:18.559180 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 23 01:06:18.576034 containerd[1649]: time="2026-01-23T01:06:18.575965643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-p8jwb,Uid:c653a68f-2222-446d-95a2-4c0f5a91fab6,Namespace:calico-system,Attempt:0,} returns sandbox id \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\""
Jan 23 01:06:19.892687 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount468659419.mount: Deactivated successfully.
Jan 23 01:06:20.021617 kubelet[2882]: E0123 01:06:20.021504 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e"
Jan 23 01:06:21.115138 containerd[1649]: time="2026-01-23T01:06:21.114640256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:21.115940 containerd[1649]: time="2026-01-23T01:06:21.115922341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Jan 23 01:06:21.119538 containerd[1649]: time="2026-01-23T01:06:21.119501033Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:21.121641 containerd[1649]: time="2026-01-23T01:06:21.121618256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:21.122421 containerd[1649]: time="2026-01-23T01:06:21.122398209Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.706791512s"
Jan 23 01:06:21.122450 containerd[1649]: time="2026-01-23T01:06:21.122420025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Jan 23 01:06:21.123524 containerd[1649]: time="2026-01-23T01:06:21.123219354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Jan 23 01:06:21.138071 containerd[1649]: time="2026-01-23T01:06:21.138041432Z" level=info msg="CreateContainer within sandbox \"f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 23 01:06:21.148685 containerd[1649]: time="2026-01-23T01:06:21.148657643Z" level=info msg="Container 23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377: CDI devices from CRI Config.CDIDevices: []"
Jan 23 01:06:21.153239 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2475341705.mount: Deactivated successfully.
Jan 23 01:06:21.163330 containerd[1649]: time="2026-01-23T01:06:21.163293106Z" level=info msg="CreateContainer within sandbox \"f8a264eb4c6e0a3ed937e4f29cba5278b128e7cd271381327833235a6abf77fd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377\""
Jan 23 01:06:21.164711 containerd[1649]: time="2026-01-23T01:06:21.163704885Z" level=info msg="StartContainer for \"23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377\""
Jan 23 01:06:21.165643 containerd[1649]: time="2026-01-23T01:06:21.165609726Z" level=info msg="connecting to shim 23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377" address="unix:///run/containerd/s/21ffeeb0e0cac90bee38dcb4dff81e37c8e5271a9f58029b7af3bf3061b53789" protocol=ttrpc version=3
Jan 23 01:06:21.185653 systemd[1]: Started cri-containerd-23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377.scope - libcontainer container 23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377.
Jan 23 01:06:21.231976 containerd[1649]: time="2026-01-23T01:06:21.231943955Z" level=info msg="StartContainer for \"23db593a48f2fc76fe148126a639506b2840600ae1b0bd38d622bfa8bf110377\" returns successfully"
Jan 23 01:06:22.021565 kubelet[2882]: E0123 01:06:22.021136 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e"
Jan 23 01:06:22.137618 kubelet[2882]: I0123 01:06:22.137489 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b666cc779-6djgh" podStartSLOduration=2.428749063 podStartE2EDuration="5.137473362s" podCreationTimestamp="2026-01-23 01:06:17 +0000 UTC" firstStartedPulling="2026-01-23 01:06:18.414286734 +0000 UTC m=+19.486912290" lastFinishedPulling="2026-01-23 01:06:21.123011031 +0000 UTC m=+22.195636589" observedRunningTime="2026-01-23 01:06:22.136482667 +0000 UTC m=+23.209108259" watchObservedRunningTime="2026-01-23 01:06:22.137473362 +0000 UTC m=+23.210098967"
Jan 23 01:06:22.153925 kubelet[2882]: E0123 01:06:22.153734 2882 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 23 01:06:22.153925 kubelet[2882]: W0123 01:06:22.153774 2882 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 23 01:06:22.153925 kubelet[2882]: E0123 01:06:22.153796 2882 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
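The pod_startup_latency_tracker entry above is plain duration arithmetic over the timestamps it carries: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image pull window (lastFinishedPulling minus firstStartedPulling). A quick stdlib check with the timestamps copied from the entry; the wall-clock strings lose some of the monotonic-clock precision the kubelet uses internally, so the SLO figure agrees only to within a few nanoseconds:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// Layout matching the kubelet's wall-clock timestamp strings.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

// parse drops the " m=+..." monotonic suffix before parsing.
func parse(ts string) time.Time {
	t, err := time.Parse(layout, strings.Split(ts, " m=")[0])
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := parse("2026-01-23 01:06:17 +0000 UTC")
	watched := parse("2026-01-23 01:06:22.137473362 +0000 UTC m=+23.210098967")
	firstPull := parse("2026-01-23 01:06:18.414286734 +0000 UTC m=+19.486912290")
	lastPull := parse("2026-01-23 01:06:21.123011031 +0000 UTC m=+22.195636589")

	e2e := watched.Sub(created)      // 5.137473362s, the logged podStartE2EDuration
	pull := lastPull.Sub(firstPull)  // 2.708724297s spent pulling the typha image
	fmt.Println(e2e, pull, e2e-pull) // e2e-pull ≈ 2.428749s, the logged podStartSLOduration
}
```

The "in 2.706791512s" on the containerd-side "Pulled image" record appears to be containerd's own measurement of the same pull over a slightly narrower window, hence the small difference from the kubelet's figure.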
Jan 23 01:06:22.414731 containerd[1649]: time="2026-01-23T01:06:22.414697314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:22.416494 containerd[1649]: time="2026-01-23T01:06:22.416475675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754"
Jan 23 01:06:22.419500 containerd[1649]: time="2026-01-23T01:06:22.419485646Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:22.422925 containerd[1649]: time="2026-01-23T01:06:22.422901751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 01:06:22.423524 containerd[1649]: time="2026-01-23T01:06:22.423455742Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.300215261s"
Jan 23 01:06:22.423524 containerd[1649]: time="2026-01-23T01:06:22.423480123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\""
Jan 23 01:06:22.429582 containerd[1649]: time="2026-01-23T01:06:22.429560981Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 23 01:06:22.448501 containerd[1649]: time="2026-01-23T01:06:22.447744799Z" level=info msg="Container d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038: CDI devices from CRI Config.CDIDevices: []"
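Each pull produces the same trio of records: an ImageCreate for the tag, one for the content-addressed image id, and one for the repo digest, followed by a "Pulled image" summary that pins all three together. The equivalent operation through containerd's Go client looks roughly like the sketch below (module path per the containerd 1.x client; the CRI plugin keeps its images in the k8s.io namespace, as the shim addresses above also show):

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Same socket the CRI plugin serves on this host.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images and containers live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	// Name() is the repo tag; Target().Digest is the repo digest the
	// "Pulled image" record pins next to it.
	fmt.Println(img.Name(), img.Target().Digest)
}
```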
CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:22.447930 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2493978652.mount: Deactivated successfully. Jan 23 01:06:22.460371 containerd[1649]: time="2026-01-23T01:06:22.460339697Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038\"" Jan 23 01:06:22.460974 containerd[1649]: time="2026-01-23T01:06:22.460953872Z" level=info msg="StartContainer for \"d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038\"" Jan 23 01:06:22.462283 containerd[1649]: time="2026-01-23T01:06:22.462262085Z" level=info msg="connecting to shim d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038" address="unix:///run/containerd/s/41c0abe3e30297158553a0b66978e82dd0d2a7cdb7498e2e6c20c863e557cf3d" protocol=ttrpc version=3 Jan 23 01:06:22.480665 systemd[1]: Started cri-containerd-d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038.scope - libcontainer container d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038. Jan 23 01:06:22.535460 containerd[1649]: time="2026-01-23T01:06:22.535418849Z" level=info msg="StartContainer for \"d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038\" returns successfully" Jan 23 01:06:22.543294 systemd[1]: cri-containerd-d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038.scope: Deactivated successfully. Jan 23 01:06:22.546561 containerd[1649]: time="2026-01-23T01:06:22.546486847Z" level=info msg="received container exit event container_id:\"d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038\" id:\"d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038\" pid:3565 exited_at:{seconds:1769130382 nanos:546153080}" Jan 23 01:06:22.570019 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d5fc83cc4bd17bc1cc3666772fe6991c643cd69db74635d524d455bff4104038-rootfs.mount: Deactivated successfully. 
Jan 23 01:06:23.130220 kubelet[2882]: I0123 01:06:23.129885 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 01:06:24.021527 kubelet[2882]: E0123 01:06:24.021452 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:25.148911 containerd[1649]: time="2026-01-23T01:06:25.148481998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 01:06:26.021966 kubelet[2882]: E0123 01:06:26.021623 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:27.760523 containerd[1649]: time="2026-01-23T01:06:27.760469752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:27.761885 containerd[1649]: time="2026-01-23T01:06:27.761767510Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Jan 23 01:06:27.763993 containerd[1649]: time="2026-01-23T01:06:27.763973976Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:27.766458 containerd[1649]: time="2026-01-23T01:06:27.766333713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:27.766932 containerd[1649]: time="2026-01-23T01:06:27.766758810Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.618170335s" Jan 23 01:06:27.766932 containerd[1649]: time="2026-01-23T01:06:27.766784230Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 23 01:06:27.771769 containerd[1649]: time="2026-01-23T01:06:27.771749122Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 23 01:06:27.783096 containerd[1649]: time="2026-01-23T01:06:27.782903606Z" level=info msg="Container 292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:27.793916 containerd[1649]: time="2026-01-23T01:06:27.793890972Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9\"" Jan 23 01:06:27.795399 containerd[1649]: 
time="2026-01-23T01:06:27.795361109Z" level=info msg="StartContainer for \"292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9\"" Jan 23 01:06:27.796678 containerd[1649]: time="2026-01-23T01:06:27.796654596Z" level=info msg="connecting to shim 292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9" address="unix:///run/containerd/s/41c0abe3e30297158553a0b66978e82dd0d2a7cdb7498e2e6c20c863e557cf3d" protocol=ttrpc version=3 Jan 23 01:06:27.820693 systemd[1]: Started cri-containerd-292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9.scope - libcontainer container 292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9. Jan 23 01:06:27.898474 containerd[1649]: time="2026-01-23T01:06:27.898441047Z" level=info msg="StartContainer for \"292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9\" returns successfully" Jan 23 01:06:28.023829 kubelet[2882]: E0123 01:06:28.021922 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:29.252446 systemd[1]: cri-containerd-292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9.scope: Deactivated successfully. Jan 23 01:06:29.253002 systemd[1]: cri-containerd-292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9.scope: Consumed 585ms CPU time, 193.2M memory peak, 171.3M written to disk. Jan 23 01:06:29.254019 containerd[1649]: time="2026-01-23T01:06:29.253987840Z" level=info msg="received container exit event container_id:\"292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9\" id:\"292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9\" pid:3624 exited_at:{seconds:1769130389 nanos:253702190}" Jan 23 01:06:29.267064 kubelet[2882]: I0123 01:06:29.267039 2882 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 23 01:06:29.280578 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-292ed70814cb80aa4e5b39a178f247e783ad297f4c49899efc1e7a67ebc99cd9-rootfs.mount: Deactivated successfully. Jan 23 01:06:30.685342 systemd[1]: Created slice kubepods-burstable-pod2bf6baca_f998_4b11_9305_bcbec3ac4e25.slice - libcontainer container kubepods-burstable-pod2bf6baca_f998_4b11_9305_bcbec3ac4e25.slice. Jan 23 01:06:30.702575 systemd[1]: Created slice kubepods-besteffort-podf7eac782_ccd6_467d_a1a8_1d1f5f096853.slice - libcontainer container kubepods-besteffort-podf7eac782_ccd6_467d_a1a8_1d1f5f096853.slice. Jan 23 01:06:30.751649 systemd[1]: Created slice kubepods-burstable-pod175eaffb_3ffd_4a3f_999d_7e0a67b54b14.slice - libcontainer container kubepods-burstable-pod175eaffb_3ffd_4a3f_999d_7e0a67b54b14.slice. 
Jan 23 01:06:30.756891 kubelet[2882]: I0123 01:06:30.756562 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj4bx\" (UniqueName: \"kubernetes.io/projected/f7eac782-ccd6-467d-a1a8-1d1f5f096853-kube-api-access-wj4bx\") pod \"calico-kube-controllers-7cd6c7ccb7-hrmnq\" (UID: \"f7eac782-ccd6-467d-a1a8-1d1f5f096853\") " pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" Jan 23 01:06:30.756891 kubelet[2882]: I0123 01:06:30.756625 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ade066f3-57a6-4b59-8736-58fe8e8e36bc-calico-apiserver-certs\") pod \"calico-apiserver-57669cbdbb-hrfjp\" (UID: \"ade066f3-57a6-4b59-8736-58fe8e8e36bc\") " pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" Jan 23 01:06:30.756891 kubelet[2882]: I0123 01:06:30.756646 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw5bp\" (UniqueName: \"kubernetes.io/projected/ade066f3-57a6-4b59-8736-58fe8e8e36bc-kube-api-access-zw5bp\") pod \"calico-apiserver-57669cbdbb-hrfjp\" (UID: \"ade066f3-57a6-4b59-8736-58fe8e8e36bc\") " pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" Jan 23 01:06:30.756891 kubelet[2882]: I0123 01:06:30.756689 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtbp\" (UniqueName: \"kubernetes.io/projected/2bf6baca-f998-4b11-9305-bcbec3ac4e25-kube-api-access-9vtbp\") pod \"coredns-66bc5c9577-twdcc\" (UID: \"2bf6baca-f998-4b11-9305-bcbec3ac4e25\") " pod="kube-system/coredns-66bc5c9577-twdcc" Jan 23 01:06:30.756891 kubelet[2882]: I0123 01:06:30.756720 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7eac782-ccd6-467d-a1a8-1d1f5f096853-tigera-ca-bundle\") pod \"calico-kube-controllers-7cd6c7ccb7-hrmnq\" (UID: \"f7eac782-ccd6-467d-a1a8-1d1f5f096853\") " pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" Jan 23 01:06:30.757403 kubelet[2882]: I0123 01:06:30.756770 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/175eaffb-3ffd-4a3f-999d-7e0a67b54b14-config-volume\") pod \"coredns-66bc5c9577-m6g24\" (UID: \"175eaffb-3ffd-4a3f-999d-7e0a67b54b14\") " pod="kube-system/coredns-66bc5c9577-m6g24" Jan 23 01:06:30.757403 kubelet[2882]: I0123 01:06:30.756787 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2bf6baca-f998-4b11-9305-bcbec3ac4e25-config-volume\") pod \"coredns-66bc5c9577-twdcc\" (UID: \"2bf6baca-f998-4b11-9305-bcbec3ac4e25\") " pod="kube-system/coredns-66bc5c9577-twdcc" Jan 23 01:06:30.757403 kubelet[2882]: I0123 01:06:30.756806 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfq9\" (UniqueName: \"kubernetes.io/projected/175eaffb-3ffd-4a3f-999d-7e0a67b54b14-kube-api-access-wxfq9\") pod \"coredns-66bc5c9577-m6g24\" (UID: \"175eaffb-3ffd-4a3f-999d-7e0a67b54b14\") " pod="kube-system/coredns-66bc5c9577-m6g24" Jan 23 01:06:30.764608 systemd[1]: Created slice kubepods-besteffort-podbfc1d39e_fed0_4ad0_8a64_aa0c649c314e.slice - libcontainer container 
kubepods-besteffort-podbfc1d39e_fed0_4ad0_8a64_aa0c649c314e.slice. Jan 23 01:06:30.771381 containerd[1649]: time="2026-01-23T01:06:30.771318967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p7kwl,Uid:bfc1d39e-fed0-4ad0-8a64-aa0c649c314e,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:30.781826 systemd[1]: Created slice kubepods-besteffort-podade066f3_57a6_4b59_8736_58fe8e8e36bc.slice - libcontainer container kubepods-besteffort-podade066f3_57a6_4b59_8736_58fe8e8e36bc.slice. Jan 23 01:06:30.791496 systemd[1]: Created slice kubepods-besteffort-pod38be9c86_8462_40da_b6c5_51dc537715d5.slice - libcontainer container kubepods-besteffort-pod38be9c86_8462_40da_b6c5_51dc537715d5.slice. Jan 23 01:06:30.803150 systemd[1]: Created slice kubepods-besteffort-podabca6cb1_d45d_4716_90fc_9fea5bf2bb4c.slice - libcontainer container kubepods-besteffort-podabca6cb1_d45d_4716_90fc_9fea5bf2bb4c.slice. Jan 23 01:06:30.812118 systemd[1]: Created slice kubepods-besteffort-pod644b330e_04f5_4cea_ab8f_48f61da30d02.slice - libcontainer container kubepods-besteffort-pod644b330e_04f5_4cea_ab8f_48f61da30d02.slice. Jan 23 01:06:30.857924 kubelet[2882]: I0123 01:06:30.857873 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-backend-key-pair\") pod \"whisker-86444447bc-c8mfk\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " pod="calico-system/whisker-86444447bc-c8mfk" Jan 23 01:06:30.861238 kubelet[2882]: I0123 01:06:30.861177 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/38be9c86-8462-40da-b6c5-51dc537715d5-calico-apiserver-certs\") pod \"calico-apiserver-57669cbdbb-pnljq\" (UID: \"38be9c86-8462-40da-b6c5-51dc537715d5\") " pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" Jan 23 01:06:30.861686 kubelet[2882]: I0123 01:06:30.861549 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zw7j\" (UniqueName: \"kubernetes.io/projected/644b330e-04f5-4cea-ab8f-48f61da30d02-kube-api-access-9zw7j\") pod \"whisker-86444447bc-c8mfk\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " pod="calico-system/whisker-86444447bc-c8mfk" Jan 23 01:06:30.861686 kubelet[2882]: I0123 01:06:30.861625 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abca6cb1-d45d-4716-90fc-9fea5bf2bb4c-config\") pod \"goldmane-7c778bb748-sx6dp\" (UID: \"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c\") " pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:30.861686 kubelet[2882]: I0123 01:06:30.861643 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abca6cb1-d45d-4716-90fc-9fea5bf2bb4c-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-sx6dp\" (UID: \"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c\") " pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:30.861686 kubelet[2882]: I0123 01:06:30.861658 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/abca6cb1-d45d-4716-90fc-9fea5bf2bb4c-goldmane-key-pair\") pod \"goldmane-7c778bb748-sx6dp\" (UID: 
\"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c\") " pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:30.861979 kubelet[2882]: I0123 01:06:30.861675 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-ca-bundle\") pod \"whisker-86444447bc-c8mfk\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " pod="calico-system/whisker-86444447bc-c8mfk" Jan 23 01:06:30.861979 kubelet[2882]: I0123 01:06:30.861936 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nhb2\" (UniqueName: \"kubernetes.io/projected/38be9c86-8462-40da-b6c5-51dc537715d5-kube-api-access-5nhb2\") pod \"calico-apiserver-57669cbdbb-pnljq\" (UID: \"38be9c86-8462-40da-b6c5-51dc537715d5\") " pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" Jan 23 01:06:30.861979 kubelet[2882]: I0123 01:06:30.861958 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnb4\" (UniqueName: \"kubernetes.io/projected/abca6cb1-d45d-4716-90fc-9fea5bf2bb4c-kube-api-access-nwnb4\") pod \"goldmane-7c778bb748-sx6dp\" (UID: \"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c\") " pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:30.865537 containerd[1649]: time="2026-01-23T01:06:30.864691553Z" level=error msg="Failed to destroy network for sandbox \"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:30.867923 containerd[1649]: time="2026-01-23T01:06:30.867838782Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p7kwl,Uid:bfc1d39e-fed0-4ad0-8a64-aa0c649c314e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:30.870049 kubelet[2882]: E0123 01:06:30.870026 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:30.870114 kubelet[2882]: E0123 01:06:30.870067 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p7kwl" Jan 23 01:06:30.870114 kubelet[2882]: E0123 01:06:30.870083 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p7kwl" Jan 23 01:06:30.870166 kubelet[2882]: E0123 01:06:30.870124 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b405d03a9da0afd1cbe73abc5b170ee3df76bfd33a0855a5e3bc74bade12ea5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:30.871752 systemd[1]: run-netns-cni\x2de30fc75c\x2de4a3\x2d3fe8\x2dd073\x2d59d6f9594fb0.mount: Deactivated successfully. Jan 23 01:06:31.002873 containerd[1649]: time="2026-01-23T01:06:31.002695009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-twdcc,Uid:2bf6baca-f998-4b11-9305-bcbec3ac4e25,Namespace:kube-system,Attempt:0,}" Jan 23 01:06:31.017706 containerd[1649]: time="2026-01-23T01:06:31.017628683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd6c7ccb7-hrmnq,Uid:f7eac782-ccd6-467d-a1a8-1d1f5f096853,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:31.063337 containerd[1649]: time="2026-01-23T01:06:31.063294504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-m6g24,Uid:175eaffb-3ffd-4a3f-999d-7e0a67b54b14,Namespace:kube-system,Attempt:0,}" Jan 23 01:06:31.089398 containerd[1649]: time="2026-01-23T01:06:31.089326617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-hrfjp,Uid:ade066f3-57a6-4b59-8736-58fe8e8e36bc,Namespace:calico-apiserver,Attempt:0,}" Jan 23 01:06:31.098532 containerd[1649]: time="2026-01-23T01:06:31.098470440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-pnljq,Uid:38be9c86-8462-40da-b6c5-51dc537715d5,Namespace:calico-apiserver,Attempt:0,}" Jan 23 01:06:31.099355 containerd[1649]: time="2026-01-23T01:06:31.098958222Z" level=error msg="Failed to destroy network for sandbox \"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.102735 containerd[1649]: time="2026-01-23T01:06:31.102699800Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-twdcc,Uid:2bf6baca-f998-4b11-9305-bcbec3ac4e25,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.102989 kubelet[2882]: E0123 01:06:31.102963 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.103195 kubelet[2882]: E0123 01:06:31.103077 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-twdcc" Jan 23 01:06:31.103195 kubelet[2882]: E0123 01:06:31.103095 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-twdcc" Jan 23 01:06:31.103195 kubelet[2882]: E0123 01:06:31.103158 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-twdcc_kube-system(2bf6baca-f998-4b11-9305-bcbec3ac4e25)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-twdcc_kube-system(2bf6baca-f998-4b11-9305-bcbec3ac4e25)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6019366250fc8c6ae0edf48712bddcb7d354eb64da54743c56d29972f81fb600\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-twdcc" podUID="2bf6baca-f998-4b11-9305-bcbec3ac4e25" Jan 23 01:06:31.111625 containerd[1649]: time="2026-01-23T01:06:31.111470412Z" level=error msg="Failed to destroy network for sandbox \"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.115284 containerd[1649]: time="2026-01-23T01:06:31.115250847Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd6c7ccb7-hrmnq,Uid:f7eac782-ccd6-467d-a1a8-1d1f5f096853,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.115624 kubelet[2882]: E0123 01:06:31.115498 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.115979 kubelet[2882]: E0123 01:06:31.115701 2882 kuberuntime_sandbox.go:71] "Failed 
to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" Jan 23 01:06:31.115979 kubelet[2882]: E0123 01:06:31.115720 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" Jan 23 01:06:31.115979 kubelet[2882]: E0123 01:06:31.115761 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a5b6e6c623cb4acc097aae9f251fc6262e810b1a0f4cc343f5009f6f4aa812cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:06:31.117565 containerd[1649]: time="2026-01-23T01:06:31.117498592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sx6dp,Uid:abca6cb1-d45d-4716-90fc-9fea5bf2bb4c,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:31.120889 containerd[1649]: time="2026-01-23T01:06:31.120867740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86444447bc-c8mfk,Uid:644b330e-04f5-4cea-ab8f-48f61da30d02,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:31.175074 containerd[1649]: time="2026-01-23T01:06:31.174849968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 01:06:31.195533 containerd[1649]: time="2026-01-23T01:06:31.195442636Z" level=error msg="Failed to destroy network for sandbox \"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.204501 containerd[1649]: time="2026-01-23T01:06:31.204368614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-hrfjp,Uid:ade066f3-57a6-4b59-8736-58fe8e8e36bc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.204965 kubelet[2882]: E0123 01:06:31.204628 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.204965 kubelet[2882]: E0123 01:06:31.204848 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" Jan 23 01:06:31.204965 kubelet[2882]: E0123 01:06:31.204864 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" Jan 23 01:06:31.205174 kubelet[2882]: E0123 01:06:31.205102 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"051a02a42f8cb88e8022a90b7407a966a539be4d89bd4ea3a137299628218777\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:06:31.212655 containerd[1649]: time="2026-01-23T01:06:31.211948730Z" level=error msg="Failed to destroy network for sandbox \"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.214482 containerd[1649]: time="2026-01-23T01:06:31.214445954Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-m6g24,Uid:175eaffb-3ffd-4a3f-999d-7e0a67b54b14,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.214871 kubelet[2882]: E0123 01:06:31.214847 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.214970 
kubelet[2882]: E0123 01:06:31.214953 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-m6g24" Jan 23 01:06:31.215031 kubelet[2882]: E0123 01:06:31.215022 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-m6g24" Jan 23 01:06:31.215136 kubelet[2882]: E0123 01:06:31.215120 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-m6g24_kube-system(175eaffb-3ffd-4a3f-999d-7e0a67b54b14)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-m6g24_kube-system(175eaffb-3ffd-4a3f-999d-7e0a67b54b14)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"16d5f4b5443ca306674b68a9bbfd1a2caa6be471e6b691893cace8ed2157fd78\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-m6g24" podUID="175eaffb-3ffd-4a3f-999d-7e0a67b54b14" Jan 23 01:06:31.224131 containerd[1649]: time="2026-01-23T01:06:31.224102600Z" level=error msg="Failed to destroy network for sandbox \"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.226174 containerd[1649]: time="2026-01-23T01:06:31.225997532Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sx6dp,Uid:abca6cb1-d45d-4716-90fc-9fea5bf2bb4c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.226346 kubelet[2882]: E0123 01:06:31.226154 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.226346 kubelet[2882]: E0123 01:06:31.226307 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:31.226346 kubelet[2882]: E0123 01:06:31.226323 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sx6dp" Jan 23 01:06:31.226531 kubelet[2882]: E0123 01:06:31.226477 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22d4ea5a612f054a4435c3876e45f1a51b780a162baf8e6430db8352e855515e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:06:31.245144 containerd[1649]: time="2026-01-23T01:06:31.245107493Z" level=error msg="Failed to destroy network for sandbox \"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.245909 containerd[1649]: time="2026-01-23T01:06:31.245884776Z" level=error msg="Failed to destroy network for sandbox \"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.247603 containerd[1649]: time="2026-01-23T01:06:31.247498123Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-pnljq,Uid:38be9c86-8462-40da-b6c5-51dc537715d5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.247720 kubelet[2882]: E0123 01:06:31.247676 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.247769 kubelet[2882]: E0123 01:06:31.247716 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" Jan 23 01:06:31.247769 kubelet[2882]: E0123 01:06:31.247735 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" Jan 23 01:06:31.248613 kubelet[2882]: E0123 01:06:31.247778 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b4d1a7cf9e36dea8d8d9dd498e14b5bcf5d7f3794d2515df4267a49cb0a7cc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:06:31.248934 containerd[1649]: time="2026-01-23T01:06:31.248909694Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86444447bc-c8mfk,Uid:644b330e-04f5-4cea-ab8f-48f61da30d02,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.249213 kubelet[2882]: E0123 01:06:31.249138 2882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 01:06:31.249458 kubelet[2882]: E0123 01:06:31.249280 2882 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86444447bc-c8mfk" Jan 23 01:06:31.249458 kubelet[2882]: E0123 01:06:31.249309 2882 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86444447bc-c8mfk" Jan 23 
01:06:31.249458 kubelet[2882]: E0123 01:06:31.249348 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-86444447bc-c8mfk_calico-system(644b330e-04f5-4cea-ab8f-48f61da30d02)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-86444447bc-c8mfk_calico-system(644b330e-04f5-4cea-ab8f-48f61da30d02)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1af4fd13780f9997fa17fd4d3cfc732f00cfaccf7d92ec3ed8562cc864f798d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-86444447bc-c8mfk" podUID="644b330e-04f5-4cea-ab8f-48f61da30d02" Jan 23 01:06:35.697707 kubelet[2882]: I0123 01:06:35.697672 2882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 01:06:36.423226 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1973511018.mount: Deactivated successfully. Jan 23 01:06:36.460590 containerd[1649]: time="2026-01-23T01:06:36.460055310Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:36.461470 containerd[1649]: time="2026-01-23T01:06:36.461452537Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Jan 23 01:06:36.463133 containerd[1649]: time="2026-01-23T01:06:36.463114974Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:36.465735 containerd[1649]: time="2026-01-23T01:06:36.465717524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 01:06:36.466108 containerd[1649]: time="2026-01-23T01:06:36.466086419Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.291193422s" Jan 23 01:06:36.466146 containerd[1649]: time="2026-01-23T01:06:36.466116396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 23 01:06:36.479366 containerd[1649]: time="2026-01-23T01:06:36.479339027Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 01:06:36.492934 containerd[1649]: time="2026-01-23T01:06:36.492526374Z" level=info msg="Container 90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:36.508680 containerd[1649]: time="2026-01-23T01:06:36.508647025Z" level=info msg="CreateContainer within sandbox \"e39ccd2c9eb4de02b69a39d7a6fbf9d3402b3ba196cfb3d130342e0377093f42\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4\"" Jan 23 01:06:36.509375 containerd[1649]: 
time="2026-01-23T01:06:36.509355241Z" level=info msg="StartContainer for \"90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4\"" Jan 23 01:06:36.510788 containerd[1649]: time="2026-01-23T01:06:36.510766831Z" level=info msg="connecting to shim 90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4" address="unix:///run/containerd/s/41c0abe3e30297158553a0b66978e82dd0d2a7cdb7498e2e6c20c863e557cf3d" protocol=ttrpc version=3 Jan 23 01:06:36.562718 systemd[1]: Started cri-containerd-90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4.scope - libcontainer container 90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4. Jan 23 01:06:36.639755 containerd[1649]: time="2026-01-23T01:06:36.639678795Z" level=info msg="StartContainer for \"90a8e7f73e46520c1cd39f9adb57a6e6f0fa83509935a8dc22f17774d19623c4\" returns successfully" Jan 23 01:06:36.776724 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 01:06:36.776891 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 23 01:06:37.008336 kubelet[2882]: I0123 01:06:37.008300 2882 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-backend-key-pair\") pod \"644b330e-04f5-4cea-ab8f-48f61da30d02\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " Jan 23 01:06:37.008336 kubelet[2882]: I0123 01:06:37.008339 2882 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-ca-bundle\") pod \"644b330e-04f5-4cea-ab8f-48f61da30d02\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " Jan 23 01:06:37.008336 kubelet[2882]: I0123 01:06:37.008360 2882 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zw7j\" (UniqueName: \"kubernetes.io/projected/644b330e-04f5-4cea-ab8f-48f61da30d02-kube-api-access-9zw7j\") pod \"644b330e-04f5-4cea-ab8f-48f61da30d02\" (UID: \"644b330e-04f5-4cea-ab8f-48f61da30d02\") " Jan 23 01:06:37.011812 kubelet[2882]: I0123 01:06:37.011734 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "644b330e-04f5-4cea-ab8f-48f61da30d02" (UID: "644b330e-04f5-4cea-ab8f-48f61da30d02"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 01:06:37.012887 kubelet[2882]: I0123 01:06:37.012856 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "644b330e-04f5-4cea-ab8f-48f61da30d02" (UID: "644b330e-04f5-4cea-ab8f-48f61da30d02"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 01:06:37.014628 kubelet[2882]: I0123 01:06:37.014590 2882 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644b330e-04f5-4cea-ab8f-48f61da30d02-kube-api-access-9zw7j" (OuterVolumeSpecName: "kube-api-access-9zw7j") pod "644b330e-04f5-4cea-ab8f-48f61da30d02" (UID: "644b330e-04f5-4cea-ab8f-48f61da30d02"). InnerVolumeSpecName "kube-api-access-9zw7j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 01:06:37.028591 systemd[1]: Removed slice kubepods-besteffort-pod644b330e_04f5_4cea_ab8f_48f61da30d02.slice - libcontainer container kubepods-besteffort-pod644b330e_04f5_4cea_ab8f_48f61da30d02.slice. Jan 23 01:06:37.109324 kubelet[2882]: I0123 01:06:37.109265 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-backend-key-pair\") on node \"ci-4459-2-2-n-615049e46b\" DevicePath \"\"" Jan 23 01:06:37.109324 kubelet[2882]: I0123 01:06:37.109290 2882 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/644b330e-04f5-4cea-ab8f-48f61da30d02-whisker-ca-bundle\") on node \"ci-4459-2-2-n-615049e46b\" DevicePath \"\"" Jan 23 01:06:37.109324 kubelet[2882]: I0123 01:06:37.109302 2882 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9zw7j\" (UniqueName: \"kubernetes.io/projected/644b330e-04f5-4cea-ab8f-48f61da30d02-kube-api-access-9zw7j\") on node \"ci-4459-2-2-n-615049e46b\" DevicePath \"\"" Jan 23 01:06:37.215286 kubelet[2882]: I0123 01:06:37.214845 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-p8jwb" podStartSLOduration=1.324971386 podStartE2EDuration="19.214815725s" podCreationTimestamp="2026-01-23 01:06:18 +0000 UTC" firstStartedPulling="2026-01-23 01:06:18.577055386 +0000 UTC m=+19.649680942" lastFinishedPulling="2026-01-23 01:06:36.466899726 +0000 UTC m=+37.539525281" observedRunningTime="2026-01-23 01:06:37.213876166 +0000 UTC m=+38.286501745" watchObservedRunningTime="2026-01-23 01:06:37.214815725 +0000 UTC m=+38.287441312" Jan 23 01:06:37.280670 systemd[1]: Created slice kubepods-besteffort-pod1e833cc8_d29c_40ae_955f_226c562444ef.slice - libcontainer container kubepods-besteffort-pod1e833cc8_d29c_40ae_955f_226c562444ef.slice. Jan 23 01:06:37.412122 kubelet[2882]: I0123 01:06:37.412073 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1e833cc8-d29c-40ae-955f-226c562444ef-whisker-backend-key-pair\") pod \"whisker-86c8df774c-zx58q\" (UID: \"1e833cc8-d29c-40ae-955f-226c562444ef\") " pod="calico-system/whisker-86c8df774c-zx58q" Jan 23 01:06:37.412122 kubelet[2882]: I0123 01:06:37.412121 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zslbx\" (UniqueName: \"kubernetes.io/projected/1e833cc8-d29c-40ae-955f-226c562444ef-kube-api-access-zslbx\") pod \"whisker-86c8df774c-zx58q\" (UID: \"1e833cc8-d29c-40ae-955f-226c562444ef\") " pod="calico-system/whisker-86c8df774c-zx58q" Jan 23 01:06:37.412316 kubelet[2882]: I0123 01:06:37.412138 2882 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e833cc8-d29c-40ae-955f-226c562444ef-whisker-ca-bundle\") pod \"whisker-86c8df774c-zx58q\" (UID: \"1e833cc8-d29c-40ae-955f-226c562444ef\") " pod="calico-system/whisker-86c8df774c-zx58q" Jan 23 01:06:37.424210 systemd[1]: var-lib-kubelet-pods-644b330e\x2d04f5\x2d4cea\x2dab8f\x2d48f61da30d02-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9zw7j.mount: Deactivated successfully. 
Jan 23 01:06:37.424303 systemd[1]: var-lib-kubelet-pods-644b330e\x2d04f5\x2d4cea\x2dab8f\x2d48f61da30d02-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 23 01:06:37.589876 containerd[1649]: time="2026-01-23T01:06:37.589813733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86c8df774c-zx58q,Uid:1e833cc8-d29c-40ae-955f-226c562444ef,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:37.946062 systemd-networkd[1330]: cali202f396e8a5: Link UP Jan 23 01:06:37.947220 systemd-networkd[1330]: cali202f396e8a5: Gained carrier Jan 23 01:06:37.978076 containerd[1649]: 2026-01-23 01:06:37.632 [INFO][3975] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 01:06:37.978076 containerd[1649]: 2026-01-23 01:06:37.803 [INFO][3975] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0 whisker-86c8df774c- calico-system 1e833cc8-d29c-40ae-955f-226c562444ef 877 0 2026-01-23 01:06:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:86c8df774c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b whisker-86c8df774c-zx58q eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali202f396e8a5 [] [] }} ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-" Jan 23 01:06:37.978076 containerd[1649]: 2026-01-23 01:06:37.806 [INFO][3975] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.978076 containerd[1649]: 2026-01-23 01:06:37.869 [INFO][3987] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" HandleID="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Workload="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.870 [INFO][3987] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" HandleID="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Workload="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"whisker-86c8df774c-zx58q", "timestamp":"2026-01-23 01:06:37.869860709 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.870 [INFO][3987] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.870 [INFO][3987] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.870 [INFO][3987] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.882 [INFO][3987] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.890 [INFO][3987] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.897 [INFO][3987] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.900 [INFO][3987] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.978489 containerd[1649]: 2026-01-23 01:06:37.904 [INFO][3987] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.904 [INFO][3987] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.906 [INFO][3987] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.912 [INFO][3987] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.921 [INFO][3987] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.193/26] block=192.168.65.192/26 handle="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.921 [INFO][3987] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.193/26] handle="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.921 [INFO][3987] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 01:06:37.980533 containerd[1649]: 2026-01-23 01:06:37.922 [INFO][3987] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.193/26] IPv6=[] ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" HandleID="k8s-pod-network.febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Workload="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.981794 containerd[1649]: 2026-01-23 01:06:37.929 [INFO][3975] cni-plugin/k8s.go 418: Populated endpoint ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0", GenerateName:"whisker-86c8df774c-", Namespace:"calico-system", SelfLink:"", UID:"1e833cc8-d29c-40ae-955f-226c562444ef", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86c8df774c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"whisker-86c8df774c-zx58q", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.65.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali202f396e8a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:37.981794 containerd[1649]: 2026-01-23 01:06:37.930 [INFO][3975] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.193/32] ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.981944 containerd[1649]: 2026-01-23 01:06:37.930 [INFO][3975] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali202f396e8a5 ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.981944 containerd[1649]: 2026-01-23 01:06:37.948 [INFO][3975] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:37.982010 containerd[1649]: 2026-01-23 01:06:37.948 [INFO][3975] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" 
Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0", GenerateName:"whisker-86c8df774c-", Namespace:"calico-system", SelfLink:"", UID:"1e833cc8-d29c-40ae-955f-226c562444ef", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86c8df774c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf", Pod:"whisker-86c8df774c-zx58q", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.65.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali202f396e8a5", MAC:"ea:fc:b9:2b:5b:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:37.982081 containerd[1649]: 2026-01-23 01:06:37.974 [INFO][3975] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" Namespace="calico-system" Pod="whisker-86c8df774c-zx58q" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-whisker--86c8df774c--zx58q-eth0" Jan 23 01:06:38.045974 containerd[1649]: time="2026-01-23T01:06:38.045924709Z" level=info msg="connecting to shim febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf" address="unix:///run/containerd/s/e1836e47e1a0659dab8ae2fa207807d7a2197d150119695f6786a77bbd470474" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:38.073704 systemd[1]: Started cri-containerd-febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf.scope - libcontainer container febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf. 
Jan 23 01:06:38.129903 containerd[1649]: time="2026-01-23T01:06:38.129862125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86c8df774c-zx58q,Uid:1e833cc8-d29c-40ae-955f-226c562444ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"febc34440b56f32ae6279cec077934f351ce227f56d5390456537de864365cbf\"" Jan 23 01:06:38.132005 containerd[1649]: time="2026-01-23T01:06:38.131913561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 01:06:38.485276 containerd[1649]: time="2026-01-23T01:06:38.485169875Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:38.693748 containerd[1649]: time="2026-01-23T01:06:38.693635854Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 01:06:38.693748 containerd[1649]: time="2026-01-23T01:06:38.693718408Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 01:06:38.697117 kubelet[2882]: E0123 01:06:38.697039 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:06:38.697339 kubelet[2882]: E0123 01:06:38.697142 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:06:38.697339 kubelet[2882]: E0123 01:06:38.697295 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:38.700124 containerd[1649]: time="2026-01-23T01:06:38.700067735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 01:06:38.827498 systemd-networkd[1330]: vxlan.calico: Link UP Jan 23 01:06:38.827506 systemd-networkd[1330]: vxlan.calico: Gained carrier Jan 23 01:06:39.024571 kubelet[2882]: I0123 01:06:39.024529 2882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644b330e-04f5-4cea-ab8f-48f61da30d02" path="/var/lib/kubelet/pods/644b330e-04f5-4cea-ab8f-48f61da30d02/volumes" Jan 23 01:06:39.034019 containerd[1649]: time="2026-01-23T01:06:39.033788973Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:39.035908 containerd[1649]: time="2026-01-23T01:06:39.035873656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 01:06:39.036172 containerd[1649]: time="2026-01-23T01:06:39.035847587Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 01:06:39.036831 kubelet[2882]: E0123 01:06:39.036686 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:06:39.037034 kubelet[2882]: E0123 01:06:39.036799 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:06:39.037238 kubelet[2882]: E0123 01:06:39.037210 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:39.037486 kubelet[2882]: E0123 01:06:39.037381 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:06:39.077858 systemd-networkd[1330]: cali202f396e8a5: Gained IPv6LL Jan 23 01:06:39.199434 kubelet[2882]: E0123 01:06:39.199362 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:06:40.292911 systemd-networkd[1330]: vxlan.calico: Gained IPv6LL Jan 23 01:06:42.026702 containerd[1649]: time="2026-01-23T01:06:42.026604983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-hrfjp,Uid:ade066f3-57a6-4b59-8736-58fe8e8e36bc,Namespace:calico-apiserver,Attempt:0,}" Jan 23 01:06:42.166237 systemd-networkd[1330]: calie61e4055388: Link UP Jan 23 01:06:42.167931 systemd-networkd[1330]: calie61e4055388: Gained carrier Jan 23 01:06:42.185153 containerd[1649]: 2026-01-23 01:06:42.086 [INFO][4260] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0 calico-apiserver-57669cbdbb- calico-apiserver ade066f3-57a6-4b59-8736-58fe8e8e36bc 807 0 2026-01-23 01:06:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57669cbdbb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b calico-apiserver-57669cbdbb-hrfjp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie61e4055388 [] [] }} ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-" Jan 23 01:06:42.185153 containerd[1649]: 2026-01-23 01:06:42.086 [INFO][4260] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.185153 containerd[1649]: 2026-01-23 01:06:42.118 [INFO][4272] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" HandleID="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.119 [INFO][4272] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" HandleID="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf220), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-615049e46b", "pod":"calico-apiserver-57669cbdbb-hrfjp", "timestamp":"2026-01-23 01:06:42.118865606 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.119 [INFO][4272] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.119 [INFO][4272] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.119 [INFO][4272] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.134 [INFO][4272] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.138 [INFO][4272] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.144 [INFO][4272] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.145 [INFO][4272] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186394 containerd[1649]: 2026-01-23 01:06:42.147 [INFO][4272] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.147 [INFO][4272] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.148 [INFO][4272] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137 Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.153 [INFO][4272] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.159 [INFO][4272] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.194/26] block=192.168.65.192/26 handle="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.159 [INFO][4272] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.194/26] handle="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.159 [INFO][4272] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 01:06:42.186724 containerd[1649]: 2026-01-23 01:06:42.159 [INFO][4272] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.194/26] IPv6=[] ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" HandleID="k8s-pod-network.f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.187705 containerd[1649]: 2026-01-23 01:06:42.161 [INFO][4260] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0", GenerateName:"calico-apiserver-57669cbdbb-", Namespace:"calico-apiserver", SelfLink:"", UID:"ade066f3-57a6-4b59-8736-58fe8e8e36bc", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57669cbdbb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"calico-apiserver-57669cbdbb-hrfjp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie61e4055388", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:42.187775 containerd[1649]: 2026-01-23 01:06:42.161 [INFO][4260] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.194/32] ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.187775 containerd[1649]: 2026-01-23 01:06:42.161 [INFO][4260] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie61e4055388 ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.187775 containerd[1649]: 2026-01-23 01:06:42.166 [INFO][4260] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.187833 containerd[1649]: 2026-01-23 
01:06:42.169 [INFO][4260] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0", GenerateName:"calico-apiserver-57669cbdbb-", Namespace:"calico-apiserver", SelfLink:"", UID:"ade066f3-57a6-4b59-8736-58fe8e8e36bc", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57669cbdbb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137", Pod:"calico-apiserver-57669cbdbb-hrfjp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie61e4055388", MAC:"2e:2f:21:a7:0a:f8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:42.187885 containerd[1649]: 2026-01-23 01:06:42.180 [INFO][4260] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-hrfjp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--hrfjp-eth0" Jan 23 01:06:42.222913 containerd[1649]: time="2026-01-23T01:06:42.222823148Z" level=info msg="connecting to shim f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137" address="unix:///run/containerd/s/f0e412d6d625bf44dbfc6a5b6ca70b5e9a60a7c379092b82702c2de3aaf81512" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:42.243801 systemd[1]: Started cri-containerd-f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137.scope - libcontainer container f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137. 
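The ErrImagePull entries above transition into ImagePullBackOff, and the kubelet then waits out a doubling back-off between retries of the same image. A small sketch of that schedule, assuming kubelet's default constants (10s base doubling to a 5-minute cap); a cluster that overrides the kubelet's back-off configuration would differ:

package main

import (
	"fmt"
	"time"
)

// pullBackoff returns the wait the kubelet applies after the Nth
// consecutive failed pull of an image: start at 10s, double per
// failure, cap at 5m. The constants are the assumed defaults.
func pullBackoff(failures int) time.Duration {
	const base = 10 * time.Second
	const maxDelay = 5 * time.Minute
	d := base
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> wait %v\n", n, pullBackoff(n))
	}
	// failure 1 -> 10s ... failure 6 -> 5m0s (capped)
}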
Jan 23 01:06:42.289383 containerd[1649]: time="2026-01-23T01:06:42.289299875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-hrfjp,Uid:ade066f3-57a6-4b59-8736-58fe8e8e36bc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f4a3f01b7f0d21819c750ebaee6b69da5a9da3dc5d5a8eeda1f53d9788f8e137\"" Jan 23 01:06:42.292209 containerd[1649]: time="2026-01-23T01:06:42.292185719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:06:42.616258 containerd[1649]: time="2026-01-23T01:06:42.616118767Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:42.620277 containerd[1649]: time="2026-01-23T01:06:42.620185069Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:06:42.620502 containerd[1649]: time="2026-01-23T01:06:42.620349810Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:06:42.620782 kubelet[2882]: E0123 01:06:42.620666 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:42.621418 kubelet[2882]: E0123 01:06:42.620787 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:42.621418 kubelet[2882]: E0123 01:06:42.620951 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:42.621418 kubelet[2882]: E0123 01:06:42.621027 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:06:43.032070 containerd[1649]: time="2026-01-23T01:06:43.031768034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-twdcc,Uid:2bf6baca-f998-4b11-9305-bcbec3ac4e25,Namespace:kube-system,Attempt:0,}" Jan 23 01:06:43.037058 containerd[1649]: time="2026-01-23T01:06:43.036931025Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-pnljq,Uid:38be9c86-8462-40da-b6c5-51dc537715d5,Namespace:calico-apiserver,Attempt:0,}" Jan 23 01:06:43.211901 kubelet[2882]: E0123 01:06:43.211856 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:06:43.239735 systemd-networkd[1330]: cali7fcb2d1a353: Link UP Jan 23 01:06:43.240579 systemd-networkd[1330]: cali7fcb2d1a353: Gained carrier Jan 23 01:06:43.264089 containerd[1649]: 2026-01-23 01:06:43.134 [INFO][4344] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0 calico-apiserver-57669cbdbb- calico-apiserver 38be9c86-8462-40da-b6c5-51dc537715d5 806 0 2026-01-23 01:06:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57669cbdbb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b calico-apiserver-57669cbdbb-pnljq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7fcb2d1a353 [] [] }} ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-" Jan 23 01:06:43.264089 containerd[1649]: 2026-01-23 01:06:43.134 [INFO][4344] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.264089 containerd[1649]: 2026-01-23 01:06:43.186 [INFO][4358] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" HandleID="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.186 [INFO][4358] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" HandleID="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f8b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-615049e46b", "pod":"calico-apiserver-57669cbdbb-pnljq", "timestamp":"2026-01-23 01:06:43.186773951 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.187 [INFO][4358] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.187 [INFO][4358] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.187 [INFO][4358] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.193 [INFO][4358] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.199 [INFO][4358] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.203 [INFO][4358] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.205 [INFO][4358] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264295 containerd[1649]: 2026-01-23 01:06:43.209 [INFO][4358] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.209 [INFO][4358] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.213 [INFO][4358] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98 Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.223 [INFO][4358] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.229 [INFO][4358] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.195/26] block=192.168.65.192/26 handle="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.229 [INFO][4358] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.195/26] handle="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.229 [INFO][4358] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 01:06:43.264868 containerd[1649]: 2026-01-23 01:06:43.230 [INFO][4358] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.195/26] IPv6=[] ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" HandleID="k8s-pod-network.8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.265318 containerd[1649]: 2026-01-23 01:06:43.232 [INFO][4344] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0", GenerateName:"calico-apiserver-57669cbdbb-", Namespace:"calico-apiserver", SelfLink:"", UID:"38be9c86-8462-40da-b6c5-51dc537715d5", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57669cbdbb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"calico-apiserver-57669cbdbb-pnljq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7fcb2d1a353", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:43.265394 containerd[1649]: 2026-01-23 01:06:43.232 [INFO][4344] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.195/32] ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.265394 containerd[1649]: 2026-01-23 01:06:43.233 [INFO][4344] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7fcb2d1a353 ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.265394 containerd[1649]: 2026-01-23 01:06:43.241 [INFO][4344] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.265457 containerd[1649]: 2026-01-23 
01:06:43.241 [INFO][4344] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0", GenerateName:"calico-apiserver-57669cbdbb-", Namespace:"calico-apiserver", SelfLink:"", UID:"38be9c86-8462-40da-b6c5-51dc537715d5", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57669cbdbb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98", Pod:"calico-apiserver-57669cbdbb-pnljq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7fcb2d1a353", MAC:"b2:f2:7d:ca:f6:cd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:43.265507 containerd[1649]: 2026-01-23 01:06:43.262 [INFO][4344] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" Namespace="calico-apiserver" Pod="calico-apiserver-57669cbdbb-pnljq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--apiserver--57669cbdbb--pnljq-eth0" Jan 23 01:06:43.325919 systemd-networkd[1330]: calidf5d974f6a7: Link UP Jan 23 01:06:43.327152 systemd-networkd[1330]: calidf5d974f6a7: Gained carrier Jan 23 01:06:43.344879 containerd[1649]: 2026-01-23 01:06:43.150 [INFO][4333] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0 coredns-66bc5c9577- kube-system 2bf6baca-f998-4b11-9305-bcbec3ac4e25 803 0 2026-01-23 01:06:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b coredns-66bc5c9577-twdcc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calidf5d974f6a7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-" Jan 23 01:06:43.344879 containerd[1649]: 2026-01-23 
01:06:43.150 [INFO][4333] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.344879 containerd[1649]: 2026-01-23 01:06:43.197 [INFO][4363] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" HandleID="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.197 [INFO][4363] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" HandleID="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"coredns-66bc5c9577-twdcc", "timestamp":"2026-01-23 01:06:43.197566532 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.197 [INFO][4363] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.229 [INFO][4363] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.229 [INFO][4363] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.294 [INFO][4363] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.299 [INFO][4363] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.303 [INFO][4363] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.305 [INFO][4363] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346562 containerd[1649]: 2026-01-23 01:06:43.307 [INFO][4363] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.307 [INFO][4363] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.308 [INFO][4363] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1 Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.315 [INFO][4363] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.320 [INFO][4363] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.196/26] block=192.168.65.192/26 handle="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.320 [INFO][4363] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.196/26] handle="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.320 [INFO][4363] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 01:06:43.346794 containerd[1649]: 2026-01-23 01:06:43.320 [INFO][4363] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.196/26] IPv6=[] ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" HandleID="k8s-pod-network.8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.346947 containerd[1649]: 2026-01-23 01:06:43.322 [INFO][4333] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2bf6baca-f998-4b11-9305-bcbec3ac4e25", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"coredns-66bc5c9577-twdcc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidf5d974f6a7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:43.346947 containerd[1649]: 2026-01-23 01:06:43.322 [INFO][4333] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.196/32] ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.346947 containerd[1649]: 2026-01-23 01:06:43.322 [INFO][4333] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidf5d974f6a7 ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" 
WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.346947 containerd[1649]: 2026-01-23 01:06:43.326 [INFO][4333] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.346947 containerd[1649]: 2026-01-23 01:06:43.327 [INFO][4333] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2bf6baca-f998-4b11-9305-bcbec3ac4e25", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1", Pod:"coredns-66bc5c9577-twdcc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidf5d974f6a7", MAC:"76:eb:ee:4f:bc:fb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:43.347128 containerd[1649]: 2026-01-23 01:06:43.338 [INFO][4333] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" Namespace="kube-system" Pod="coredns-66bc5c9577-twdcc" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--twdcc-eth0" Jan 23 01:06:43.436143 containerd[1649]: time="2026-01-23T01:06:43.436102588Z" level=info msg="connecting to shim 8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98" 
address="unix:///run/containerd/s/bf6f4b90e754a53a531c74949525e09b468b8ad5fd06e6f8bb2f0574aaf3a0ba" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:43.442635 containerd[1649]: time="2026-01-23T01:06:43.442606274Z" level=info msg="connecting to shim 8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1" address="unix:///run/containerd/s/d22a5204eeef118641155a71dad200b58a589057ad199ff88caab23638f08365" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:43.476788 systemd[1]: Started cri-containerd-8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98.scope - libcontainer container 8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98. Jan 23 01:06:43.481653 systemd[1]: Started cri-containerd-8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1.scope - libcontainer container 8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1. Jan 23 01:06:43.535880 containerd[1649]: time="2026-01-23T01:06:43.535838650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-twdcc,Uid:2bf6baca-f998-4b11-9305-bcbec3ac4e25,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1\"" Jan 23 01:06:43.543537 containerd[1649]: time="2026-01-23T01:06:43.543303065Z" level=info msg="CreateContainer within sandbox \"8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 01:06:43.550336 containerd[1649]: time="2026-01-23T01:06:43.550304390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57669cbdbb-pnljq,Uid:38be9c86-8462-40da-b6c5-51dc537715d5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8be6a197c8414cb105f77718a5916e9278aec5324d077b1c3e0f732c06a48a98\"" Jan 23 01:06:43.551406 containerd[1649]: time="2026-01-23T01:06:43.551286301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:06:43.563724 containerd[1649]: time="2026-01-23T01:06:43.563705424Z" level=info msg="Container 8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:43.571212 containerd[1649]: time="2026-01-23T01:06:43.571147354Z" level=info msg="CreateContainer within sandbox \"8c4bb4533b8f194158a2fccb03b00f1697cbe91baafd53c3667778d654cfb9e1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90\"" Jan 23 01:06:43.572030 containerd[1649]: time="2026-01-23T01:06:43.572008754Z" level=info msg="StartContainer for \"8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90\"" Jan 23 01:06:43.572685 containerd[1649]: time="2026-01-23T01:06:43.572662778Z" level=info msg="connecting to shim 8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90" address="unix:///run/containerd/s/d22a5204eeef118641155a71dad200b58a589057ad199ff88caab23638f08365" protocol=ttrpc version=3 Jan 23 01:06:43.588654 systemd[1]: Started cri-containerd-8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90.scope - libcontainer container 8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90. 
Jan 23 01:06:43.613918 containerd[1649]: time="2026-01-23T01:06:43.613877638Z" level=info msg="StartContainer for \"8503594fd9ff13390396691282bd93baee85485eee72d2c28501a32e0aca8f90\" returns successfully" Jan 23 01:06:43.684920 systemd-networkd[1330]: calie61e4055388: Gained IPv6LL Jan 23 01:06:43.868827 containerd[1649]: time="2026-01-23T01:06:43.868611152Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:43.872012 containerd[1649]: time="2026-01-23T01:06:43.871922346Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:06:43.872458 containerd[1649]: time="2026-01-23T01:06:43.872015491Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:06:43.872937 kubelet[2882]: E0123 01:06:43.872847 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:43.873673 kubelet[2882]: E0123 01:06:43.872945 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:43.873673 kubelet[2882]: E0123 01:06:43.873108 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:43.873673 kubelet[2882]: E0123 01:06:43.873178 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:06:44.026943 containerd[1649]: time="2026-01-23T01:06:44.026835876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-m6g24,Uid:175eaffb-3ffd-4a3f-999d-7e0a67b54b14,Namespace:kube-system,Attempt:0,}" Jan 23 01:06:44.031228 containerd[1649]: time="2026-01-23T01:06:44.031159406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd6c7ccb7-hrmnq,Uid:f7eac782-ccd6-467d-a1a8-1d1f5f096853,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:44.220524 kubelet[2882]: E0123 01:06:44.220320 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:06:44.221580 kubelet[2882]: E0123 01:06:44.220571 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:06:44.227737 systemd-networkd[1330]: cali2e1f496eb1f: Link UP Jan 23 01:06:44.229023 systemd-networkd[1330]: cali2e1f496eb1f: Gained carrier Jan 23 01:06:44.240770 kubelet[2882]: I0123 01:06:44.240734 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-twdcc" podStartSLOduration=40.240692062 podStartE2EDuration="40.240692062s" podCreationTimestamp="2026-01-23 01:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:44.239344353 +0000 UTC m=+45.311969931" watchObservedRunningTime="2026-01-23 01:06:44.240692062 +0000 UTC m=+45.313317632" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.133 [INFO][4520] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0 calico-kube-controllers-7cd6c7ccb7- calico-system f7eac782-ccd6-467d-a1a8-1d1f5f096853 804 0 2026-01-23 01:06:18 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cd6c7ccb7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b calico-kube-controllers-7cd6c7ccb7-hrmnq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2e1f496eb1f [] [] }} ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.134 [INFO][4520] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.177 [INFO][4550] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" HandleID="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.178 [INFO][4550] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" HandleID="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd5a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"calico-kube-controllers-7cd6c7ccb7-hrmnq", "timestamp":"2026-01-23 01:06:44.177260189 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.178 [INFO][4550] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.179 [INFO][4550] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.179 [INFO][4550] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.187 [INFO][4550] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.192 [INFO][4550] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.197 [INFO][4550] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.199 [INFO][4550] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.201 [INFO][4550] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.201 [INFO][4550] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.202 [INFO][4550] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052 Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.206 [INFO][4550] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4550] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.197/26] block=192.168.65.192/26 
handle="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4550] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.197/26] handle="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4550] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 01:06:44.247067 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4550] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.197/26] IPv6=[] ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" HandleID="k8s-pod-network.821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Workload="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.222 [INFO][4520] cni-plugin/k8s.go 418: Populated endpoint ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0", GenerateName:"calico-kube-controllers-7cd6c7ccb7-", Namespace:"calico-system", SelfLink:"", UID:"f7eac782-ccd6-467d-a1a8-1d1f5f096853", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd6c7ccb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"calico-kube-controllers-7cd6c7ccb7-hrmnq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.65.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2e1f496eb1f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.222 [INFO][4520] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.197/32] ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.222 [INFO][4520] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e1f496eb1f ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" 
Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.230 [INFO][4520] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.230 [INFO][4520] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0", GenerateName:"calico-kube-controllers-7cd6c7ccb7-", Namespace:"calico-system", SelfLink:"", UID:"f7eac782-ccd6-467d-a1a8-1d1f5f096853", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd6c7ccb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052", Pod:"calico-kube-controllers-7cd6c7ccb7-hrmnq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.65.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2e1f496eb1f", MAC:"9e:6c:d8:00:9f:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:44.249556 containerd[1649]: 2026-01-23 01:06:44.243 [INFO][4520] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" Namespace="calico-system" Pod="calico-kube-controllers-7cd6c7ccb7-hrmnq" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-calico--kube--controllers--7cd6c7ccb7--hrmnq-eth0" Jan 23 01:06:44.283013 containerd[1649]: time="2026-01-23T01:06:44.282974105Z" level=info msg="connecting to shim 821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052" address="unix:///run/containerd/s/1f763e429b406947c512ea17438947f8d0578d414f990e395d6985c536277a6f" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:44.328656 systemd[1]: Started cri-containerd-821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052.scope - libcontainer container 821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052. 
Jan 23 01:06:44.347743 systemd-networkd[1330]: cali54cb06a5d91: Link UP Jan 23 01:06:44.347870 systemd-networkd[1330]: cali54cb06a5d91: Gained carrier Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.125 [INFO][4519] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0 coredns-66bc5c9577- kube-system 175eaffb-3ffd-4a3f-999d-7e0a67b54b14 805 0 2026-01-23 01:06:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b coredns-66bc5c9577-m6g24 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali54cb06a5d91 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.129 [INFO][4519] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.181 [INFO][4549] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" HandleID="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.181 [INFO][4549] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" HandleID="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5100), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"coredns-66bc5c9577-m6g24", "timestamp":"2026-01-23 01:06:44.181757847 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.181 [INFO][4549] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4549] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.216 [INFO][4549] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.289 [INFO][4549] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.309 [INFO][4549] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.321 [INFO][4549] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.324 [INFO][4549] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.327 [INFO][4549] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.327 [INFO][4549] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.329 [INFO][4549] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003 Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.334 [INFO][4549] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.341 [INFO][4549] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.198/26] block=192.168.65.192/26 handle="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.341 [INFO][4549] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.198/26] handle="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.341 [INFO][4549] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 01:06:44.366153 containerd[1649]: 2026-01-23 01:06:44.341 [INFO][4549] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.198/26] IPv6=[] ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" HandleID="k8s-pod-network.c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Workload="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366693 containerd[1649]: 2026-01-23 01:06:44.343 [INFO][4519] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"175eaffb-3ffd-4a3f-999d-7e0a67b54b14", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"coredns-66bc5c9577-m6g24", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54cb06a5d91", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:44.366693 containerd[1649]: 2026-01-23 01:06:44.344 [INFO][4519] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.198/32] ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366693 containerd[1649]: 2026-01-23 01:06:44.344 [INFO][4519] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali54cb06a5d91 ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" 
WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366693 containerd[1649]: 2026-01-23 01:06:44.349 [INFO][4519] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.366693 containerd[1649]: 2026-01-23 01:06:44.350 [INFO][4519] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"175eaffb-3ffd-4a3f-999d-7e0a67b54b14", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003", Pod:"coredns-66bc5c9577-m6g24", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54cb06a5d91", MAC:"4a:88:b6:68:78:27", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:44.366860 containerd[1649]: 2026-01-23 01:06:44.363 [INFO][4519] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" Namespace="kube-system" Pod="coredns-66bc5c9577-m6g24" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-coredns--66bc5c9577--m6g24-eth0" Jan 23 01:06:44.395641 containerd[1649]: time="2026-01-23T01:06:44.394888358Z" level=info msg="connecting to shim c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003" 
address="unix:///run/containerd/s/ed446910ffef84280db577f5ca10d9b4cfc6b6a9eb53526cdb6711012c00d56b" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:44.428756 systemd[1]: Started cri-containerd-c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003.scope - libcontainer container c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003. Jan 23 01:06:44.430631 containerd[1649]: time="2026-01-23T01:06:44.429480099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd6c7ccb7-hrmnq,Uid:f7eac782-ccd6-467d-a1a8-1d1f5f096853,Namespace:calico-system,Attempt:0,} returns sandbox id \"821cdb5487f6ac017f7a841d30a5363d89048409a9398684fdb8f88dfb507052\"" Jan 23 01:06:44.432021 containerd[1649]: time="2026-01-23T01:06:44.431925845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 01:06:44.477458 containerd[1649]: time="2026-01-23T01:06:44.477379835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-m6g24,Uid:175eaffb-3ffd-4a3f-999d-7e0a67b54b14,Namespace:kube-system,Attempt:0,} returns sandbox id \"c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003\"" Jan 23 01:06:44.483904 containerd[1649]: time="2026-01-23T01:06:44.483880444Z" level=info msg="CreateContainer within sandbox \"c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 01:06:44.498345 containerd[1649]: time="2026-01-23T01:06:44.498311705Z" level=info msg="Container aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:06:44.506573 containerd[1649]: time="2026-01-23T01:06:44.506543156Z" level=info msg="CreateContainer within sandbox \"c94e749b20774ac8f5e35b24800d51661937e0cd2886e371fc9df038dc359003\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f\"" Jan 23 01:06:44.508839 containerd[1649]: time="2026-01-23T01:06:44.508682965Z" level=info msg="StartContainer for \"aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f\"" Jan 23 01:06:44.511806 containerd[1649]: time="2026-01-23T01:06:44.511469400Z" level=info msg="connecting to shim aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f" address="unix:///run/containerd/s/ed446910ffef84280db577f5ca10d9b4cfc6b6a9eb53526cdb6711012c00d56b" protocol=ttrpc version=3 Jan 23 01:06:44.531646 systemd[1]: Started cri-containerd-aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f.scope - libcontainer container aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f. 
Jan 23 01:06:44.556810 containerd[1649]: time="2026-01-23T01:06:44.556782310Z" level=info msg="StartContainer for \"aec5f1ef4851502abf80209cf5c8bed3202b9c46fc79c0a556a87dfd5a4d7c4f\" returns successfully" Jan 23 01:06:44.762699 containerd[1649]: time="2026-01-23T01:06:44.762059375Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:44.764042 containerd[1649]: time="2026-01-23T01:06:44.763948931Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 01:06:44.764042 containerd[1649]: time="2026-01-23T01:06:44.763957017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 01:06:44.764222 kubelet[2882]: E0123 01:06:44.764191 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:06:44.764308 kubelet[2882]: E0123 01:06:44.764245 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:06:44.764522 kubelet[2882]: E0123 01:06:44.764331 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:44.764522 kubelet[2882]: E0123 01:06:44.764363 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:06:44.772721 systemd-networkd[1330]: calidf5d974f6a7: Gained IPv6LL Jan 23 01:06:44.964885 systemd-networkd[1330]: cali7fcb2d1a353: Gained IPv6LL Jan 23 01:06:45.236731 kubelet[2882]: E0123 01:06:45.236642 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:06:45.240085 kubelet[2882]: E0123 01:06:45.240015 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:06:45.298217 kubelet[2882]: I0123 01:06:45.298158 2882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-m6g24" podStartSLOduration=41.298142031 podStartE2EDuration="41.298142031s" podCreationTimestamp="2026-01-23 01:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 01:06:45.262626217 +0000 UTC m=+46.335251850" watchObservedRunningTime="2026-01-23 01:06:45.298142031 +0000 UTC m=+46.370767629" Jan 23 01:06:45.733164 systemd-networkd[1330]: cali2e1f496eb1f: Gained IPv6LL Jan 23 01:06:45.988949 systemd-networkd[1330]: cali54cb06a5d91: Gained IPv6LL Jan 23 01:06:46.028219 containerd[1649]: time="2026-01-23T01:06:46.028130530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p7kwl,Uid:bfc1d39e-fed0-4ad0-8a64-aa0c649c314e,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:46.030856 containerd[1649]: time="2026-01-23T01:06:46.030770136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sx6dp,Uid:abca6cb1-d45d-4716-90fc-9fea5bf2bb4c,Namespace:calico-system,Attempt:0,}" Jan 23 01:06:46.187425 systemd-networkd[1330]: calid4cd7e8c0c8: Link UP Jan 23 01:06:46.188306 systemd-networkd[1330]: calid4cd7e8c0c8: Gained carrier Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.107 [INFO][4721] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0 goldmane-7c778bb748- calico-system abca6cb1-d45d-4716-90fc-9fea5bf2bb4c 808 0 2026-01-23 01:06:16 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b goldmane-7c778bb748-sx6dp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid4cd7e8c0c8 [] [] }} ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.107 [INFO][4721] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" 
WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.138 [INFO][4738] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" HandleID="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Workload="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.138 [INFO][4738] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" HandleID="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Workload="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f6f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"goldmane-7c778bb748-sx6dp", "timestamp":"2026-01-23 01:06:46.138579997 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.138 [INFO][4738] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.138 [INFO][4738] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.138 [INFO][4738] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.147 [INFO][4738] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.153 [INFO][4738] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.159 [INFO][4738] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.161 [INFO][4738] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.164 [INFO][4738] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.164 [INFO][4738] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.166 [INFO][4738] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.173 [INFO][4738] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 
containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4738] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.199/26] block=192.168.65.192/26 handle="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4738] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.199/26] handle="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4738] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 01:06:46.205672 containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4738] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.199/26] IPv6=[] ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" HandleID="k8s-pod-network.cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Workload="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.183 [INFO][4721] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"goldmane-7c778bb748-sx6dp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.65.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid4cd7e8c0c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.183 [INFO][4721] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.199/32] ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.183 [INFO][4721] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid4cd7e8c0c8 ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" 
WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.189 [INFO][4721] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.189 [INFO][4721] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"abca6cb1-d45d-4716-90fc-9fea5bf2bb4c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a", Pod:"goldmane-7c778bb748-sx6dp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.65.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid4cd7e8c0c8", MAC:"7a:30:12:eb:d8:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:46.207076 containerd[1649]: 2026-01-23 01:06:46.203 [INFO][4721] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" Namespace="calico-system" Pod="goldmane-7c778bb748-sx6dp" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-goldmane--7c778bb748--sx6dp-eth0" Jan 23 01:06:46.237438 containerd[1649]: time="2026-01-23T01:06:46.235658071Z" level=info msg="connecting to shim cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a" address="unix:///run/containerd/s/bc3dd2a667808204fda58519b71728593aa672d7082d96336d3d5676b5f3f336" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:46.243835 kubelet[2882]: E0123 01:06:46.243754 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:06:46.279762 systemd[1]: Started cri-containerd-cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a.scope - libcontainer container cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a. Jan 23 01:06:46.314245 systemd-networkd[1330]: calif6954f7550c: Link UP Jan 23 01:06:46.316898 systemd-networkd[1330]: calif6954f7550c: Gained carrier Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.110 [INFO][4712] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0 csi-node-driver- calico-system bfc1d39e-fed0-4ad0-8a64-aa0c649c314e 695 0 2026-01-23 01:06:18 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-n-615049e46b csi-node-driver-p7kwl eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif6954f7550c [] [] }} ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.111 [INFO][4712] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.148 [INFO][4743] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" HandleID="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Workload="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.148 [INFO][4743] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" HandleID="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Workload="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-615049e46b", "pod":"csi-node-driver-p7kwl", "timestamp":"2026-01-23 01:06:46.148354844 +0000 UTC"}, Hostname:"ci-4459-2-2-n-615049e46b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.149 [INFO][4743] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4743] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.180 [INFO][4743] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-615049e46b' Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.250 [INFO][4743] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.267 [INFO][4743] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.273 [INFO][4743] ipam/ipam.go 511: Trying affinity for 192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.279 [INFO][4743] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.283 [INFO][4743] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.192/26 host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.283 [INFO][4743] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.65.192/26 handle="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.289 [INFO][4743] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769 Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.300 [INFO][4743] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.65.192/26 handle="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.307 [INFO][4743] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.65.200/26] block=192.168.65.192/26 handle="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.307 [INFO][4743] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.200/26] handle="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" host="ci-4459-2-2-n-615049e46b" Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.307 [INFO][4743] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
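The IPAM sequence just logged (acquire the host-wide lock, confirm this host's affinity for block 192.168.65.192/26, claim the next free address 192.168.65.200, release the lock) is Calico's block-affinity allocation spelled out entry by entry. A minimal sketch of the assignment step, with an in-memory set standing in for Calico's datastore-backed block; the names and types here are illustrative, not the real ipam package API:

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// block models one /26 affinity block owned by a host: assignments scan
// for the lowest unused address under a lock, mirroring the lock/claim
// steps in the log. The map stands in for the datastore-backed bitmap.
type block struct {
	mu   sync.Mutex // stands in for the host-wide IPAM lock
	cidr netip.Prefix
	used map[netip.Addr]bool
}

func (b *block) assign() (netip.Addr, error) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
		if !b.used[a] {
			b.used[a] = true
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	b := &block{
		cidr: netip.MustParsePrefix("192.168.65.192/26"),
		used: map[netip.Addr]bool{},
	}
	// Per the log, .192 through .199 went to earlier pods on this node;
	// the next claim then yields .200, matching csi-node-driver-p7kwl.
	for i := 0; i < 8; i++ {
		b.assign()
	}
	ip, _ := b.assign()
	fmt.Println(ip) // 192.168.65.200
}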
Jan 23 01:06:46.333635 containerd[1649]: 2026-01-23 01:06:46.307 [INFO][4743] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.65.200/26] IPv6=[] ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" HandleID="k8s-pod-network.d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Workload="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.309 [INFO][4712] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e", ResourceVersion:"695", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"", Pod:"csi-node-driver-p7kwl", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.65.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif6954f7550c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.310 [INFO][4712] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.200/32] ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.310 [INFO][4712] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6954f7550c ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.317 [INFO][4712] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.318 [INFO][4712] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bfc1d39e-fed0-4ad0-8a64-aa0c649c314e", ResourceVersion:"695", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 1, 6, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-615049e46b", ContainerID:"d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769", Pod:"csi-node-driver-p7kwl", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.65.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif6954f7550c", MAC:"32:19:06:58:99:c9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 01:06:46.334131 containerd[1649]: 2026-01-23 01:06:46.330 [INFO][4712] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" Namespace="calico-system" Pod="csi-node-driver-p7kwl" WorkloadEndpoint="ci--4459--2--2--n--615049e46b-k8s-csi--node--driver--p7kwl-eth0" Jan 23 01:06:46.369158 containerd[1649]: time="2026-01-23T01:06:46.369120568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sx6dp,Uid:abca6cb1-d45d-4716-90fc-9fea5bf2bb4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"cd7aeecc5ffd4141dfc67dcc3da2be0df07e3f80e6afdb4b400488b175b3bc5a\"" Jan 23 01:06:46.373264 containerd[1649]: time="2026-01-23T01:06:46.373205992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 01:06:46.379350 containerd[1649]: time="2026-01-23T01:06:46.379299994Z" level=info msg="connecting to shim d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769" address="unix:///run/containerd/s/466b99a52ec657118ad637d5461f094850dd443a0a73557a7d202e0711b2f2a6" namespace=k8s.io protocol=ttrpc version=3 Jan 23 01:06:46.406746 systemd[1]: Started cri-containerd-d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769.scope - libcontainer container d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769. 
Jan 23 01:06:46.435302 containerd[1649]: time="2026-01-23T01:06:46.435221889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p7kwl,Uid:bfc1d39e-fed0-4ad0-8a64-aa0c649c314e,Namespace:calico-system,Attempt:0,} returns sandbox id \"d4f2e09b491f4f6fcf6b8b2acaf4fee3e14e5bb11f00c146cfb3bf16470df769\"" Jan 23 01:06:46.709810 containerd[1649]: time="2026-01-23T01:06:46.709723714Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:46.711812 containerd[1649]: time="2026-01-23T01:06:46.711738008Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 01:06:46.711965 containerd[1649]: time="2026-01-23T01:06:46.711902741Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 01:06:46.712312 kubelet[2882]: E0123 01:06:46.712206 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:06:46.712312 kubelet[2882]: E0123 01:06:46.712290 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:06:46.713070 kubelet[2882]: E0123 01:06:46.712881 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:46.713070 kubelet[2882]: E0123 01:06:46.712965 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:06:46.713879 containerd[1649]: time="2026-01-23T01:06:46.713831714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 01:06:47.067568 containerd[1649]: time="2026-01-23T01:06:47.067357068Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:47.070159 containerd[1649]: time="2026-01-23T01:06:47.070045652Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 01:06:47.070284 containerd[1649]: time="2026-01-23T01:06:47.070075015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 01:06:47.070810 kubelet[2882]: E0123 01:06:47.070689 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:06:47.071094 kubelet[2882]: E0123 01:06:47.070901 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:06:47.071432 kubelet[2882]: E0123 01:06:47.071403 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:47.075816 containerd[1649]: time="2026-01-23T01:06:47.075780561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 01:06:47.240931 kubelet[2882]: E0123 01:06:47.240774 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:06:47.415555 containerd[1649]: time="2026-01-23T01:06:47.415132991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:47.419642 containerd[1649]: time="2026-01-23T01:06:47.419198038Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 01:06:47.419642 containerd[1649]: time="2026-01-23T01:06:47.419362700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 01:06:47.421105 kubelet[2882]: E0123 01:06:47.420870 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:06:47.423195 
kubelet[2882]: E0123 01:06:47.421063 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:06:47.423195 kubelet[2882]: E0123 01:06:47.422667 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:47.423195 kubelet[2882]: E0123 01:06:47.422743 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:47.973027 systemd-networkd[1330]: calif6954f7550c: Gained IPv6LL Jan 23 01:06:48.165380 systemd-networkd[1330]: calid4cd7e8c0c8: Gained IPv6LL Jan 23 01:06:48.245157 kubelet[2882]: E0123 01:06:48.245013 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:06:48.246262 kubelet[2882]: E0123 01:06:48.246209 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:06:54.024837 containerd[1649]: time="2026-01-23T01:06:54.024746793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 01:06:54.452183 containerd[1649]: time="2026-01-23T01:06:54.452083045Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:54.454904 containerd[1649]: time="2026-01-23T01:06:54.454751262Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 01:06:54.454904 containerd[1649]: time="2026-01-23T01:06:54.454843097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 01:06:54.455666 kubelet[2882]: E0123 01:06:54.455161 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:06:54.455666 kubelet[2882]: E0123 01:06:54.455216 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:06:54.455666 kubelet[2882]: E0123 01:06:54.455310 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:54.458452 containerd[1649]: time="2026-01-23T01:06:54.457250011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 01:06:54.807534 containerd[1649]: time="2026-01-23T01:06:54.806979385Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:54.810301 containerd[1649]: time="2026-01-23T01:06:54.810193587Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 01:06:54.810301 containerd[1649]: time="2026-01-23T01:06:54.810234496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 01:06:54.810697 kubelet[2882]: E0123 01:06:54.810649 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:06:54.810797 kubelet[2882]: E0123 01:06:54.810704 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:06:54.810855 kubelet[2882]: E0123 01:06:54.810808 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:54.810895 kubelet[2882]: E0123 01:06:54.810860 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:06:57.025637 containerd[1649]: time="2026-01-23T01:06:57.024609835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:06:57.373961 containerd[1649]: time="2026-01-23T01:06:57.373836437Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:57.376200 containerd[1649]: time="2026-01-23T01:06:57.376130077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:06:57.376388 containerd[1649]: time="2026-01-23T01:06:57.376220931Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:06:57.376770 kubelet[2882]: E0123 01:06:57.376688 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:57.376770 kubelet[2882]: E0123 01:06:57.376765 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:57.378370 kubelet[2882]: E0123 01:06:57.376871 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:06:57.378370 kubelet[2882]: E0123 01:06:57.376920 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:06:58.025221 containerd[1649]: time="2026-01-23T01:06:58.024993552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:06:58.369573 containerd[1649]: time="2026-01-23T01:06:58.369223455Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:06:58.371634 containerd[1649]: time="2026-01-23T01:06:58.371425324Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:06:58.371806 containerd[1649]: time="2026-01-23T01:06:58.371583006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:06:58.371994 kubelet[2882]: E0123 01:06:58.371883 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:58.371994 kubelet[2882]: E0123 01:06:58.371965 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:06:58.372455 kubelet[2882]: E0123 01:06:58.372128 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" 
Jan 23 01:06:58.372455 kubelet[2882]: E0123 01:06:58.372201 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:00.025622 containerd[1649]: time="2026-01-23T01:07:00.025026131Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 01:07:00.373448 containerd[1649]: time="2026-01-23T01:07:00.373352726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:00.375558 containerd[1649]: time="2026-01-23T01:07:00.375342918Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 01:07:00.375558 containerd[1649]: time="2026-01-23T01:07:00.375411336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 01:07:00.376544 kubelet[2882]: E0123 01:07:00.376426 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:07:00.377817 kubelet[2882]: E0123 01:07:00.377270 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:07:00.377817 kubelet[2882]: E0123 01:07:00.377583 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:00.377817 kubelet[2882]: E0123 01:07:00.377690 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:07:01.027552 containerd[1649]: 
time="2026-01-23T01:07:01.025883155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 01:07:01.373065 containerd[1649]: time="2026-01-23T01:07:01.372999014Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:01.375209 containerd[1649]: time="2026-01-23T01:07:01.375139238Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 01:07:01.375370 containerd[1649]: time="2026-01-23T01:07:01.375299169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 01:07:01.375771 kubelet[2882]: E0123 01:07:01.375612 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:07:01.375857 kubelet[2882]: E0123 01:07:01.375793 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:07:01.375985 kubelet[2882]: E0123 01:07:01.375935 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:01.377736 containerd[1649]: time="2026-01-23T01:07:01.377486177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 01:07:01.721012 containerd[1649]: time="2026-01-23T01:07:01.720621871Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:01.723347 containerd[1649]: time="2026-01-23T01:07:01.723206446Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 01:07:01.723484 containerd[1649]: time="2026-01-23T01:07:01.723299408Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 01:07:01.723846 kubelet[2882]: E0123 01:07:01.723779 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:07:01.724271 kubelet[2882]: E0123 
01:07:01.723866 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:07:01.724271 kubelet[2882]: E0123 01:07:01.724020 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:01.724271 kubelet[2882]: E0123 01:07:01.724111 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:07:02.022794 containerd[1649]: time="2026-01-23T01:07:02.022554202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 01:07:02.346365 containerd[1649]: time="2026-01-23T01:07:02.346260043Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:02.348735 containerd[1649]: time="2026-01-23T01:07:02.348587117Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 01:07:02.348978 containerd[1649]: time="2026-01-23T01:07:02.348670169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 01:07:02.349084 kubelet[2882]: E0123 01:07:02.348895 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:07:02.349084 kubelet[2882]: E0123 01:07:02.348977 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 
01:07:02.349268 kubelet[2882]: E0123 01:07:02.349095 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:02.349268 kubelet[2882]: E0123 01:07:02.349151 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:07:09.027464 kubelet[2882]: E0123 01:07:09.027411 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:07:09.027901 kubelet[2882]: E0123 01:07:09.027622 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:07:10.025619 kubelet[2882]: E0123 01:07:10.025549 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:11.031286 kubelet[2882]: E0123 01:07:11.031207 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:07:14.027921 kubelet[2882]: E0123 01:07:14.027829 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:07:16.025756 kubelet[2882]: E0123 01:07:16.025593 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:07:20.024028 containerd[1649]: time="2026-01-23T01:07:20.023959934Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 01:07:20.366112 containerd[1649]: time="2026-01-23T01:07:20.366060173Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:20.369149 containerd[1649]: time="2026-01-23T01:07:20.369081273Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 01:07:20.369240 containerd[1649]: time="2026-01-23T01:07:20.369165412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 01:07:20.369433 kubelet[2882]: E0123 01:07:20.369378 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:07:20.369697 kubelet[2882]: E0123 01:07:20.369440 2882 kuberuntime_image.go:43] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:07:20.369697 kubelet[2882]: E0123 01:07:20.369569 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:20.371017 containerd[1649]: time="2026-01-23T01:07:20.370978774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 01:07:20.904773 containerd[1649]: time="2026-01-23T01:07:20.904458676Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:20.907553 containerd[1649]: time="2026-01-23T01:07:20.907122197Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 01:07:20.907553 containerd[1649]: time="2026-01-23T01:07:20.907247102Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 01:07:20.908139 kubelet[2882]: E0123 01:07:20.907896 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:07:20.908139 kubelet[2882]: E0123 01:07:20.908027 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:07:20.909687 kubelet[2882]: E0123 01:07:20.909591 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:20.909834 kubelet[2882]: E0123 01:07:20.909741 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:07:21.028422 containerd[1649]: time="2026-01-23T01:07:21.027129680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:07:21.367665 containerd[1649]: time="2026-01-23T01:07:21.367618688Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:21.369472 containerd[1649]: time="2026-01-23T01:07:21.369414995Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:07:21.369562 containerd[1649]: time="2026-01-23T01:07:21.369517904Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:07:21.369718 kubelet[2882]: E0123 01:07:21.369678 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:07:21.369970 kubelet[2882]: E0123 01:07:21.369730 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:07:21.369970 kubelet[2882]: E0123 01:07:21.369842 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:21.369970 kubelet[2882]: E0123 01:07:21.369926 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:07:22.024224 containerd[1649]: time="2026-01-23T01:07:22.024184419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:07:22.342126 containerd[1649]: time="2026-01-23T01:07:22.342006146Z" level=info msg="fetch failed after status: 404 Not 
Found" host=ghcr.io Jan 23 01:07:22.344076 containerd[1649]: time="2026-01-23T01:07:22.343983542Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:07:22.344076 containerd[1649]: time="2026-01-23T01:07:22.344058470Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:07:22.344327 kubelet[2882]: E0123 01:07:22.344298 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:07:22.344411 kubelet[2882]: E0123 01:07:22.344400 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:07:22.345000 kubelet[2882]: E0123 01:07:22.344656 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:22.345000 kubelet[2882]: E0123 01:07:22.344693 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:22.345735 containerd[1649]: time="2026-01-23T01:07:22.345566933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 01:07:22.686140 containerd[1649]: time="2026-01-23T01:07:22.685998007Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:22.688235 containerd[1649]: time="2026-01-23T01:07:22.688115517Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 01:07:22.688235 containerd[1649]: time="2026-01-23T01:07:22.688195548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 01:07:22.688947 kubelet[2882]: E0123 01:07:22.688542 2882 log.go:32] "PullImage from image service 
failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:07:22.688947 kubelet[2882]: E0123 01:07:22.688593 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:07:22.688947 kubelet[2882]: E0123 01:07:22.688706 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:22.689574 kubelet[2882]: E0123 01:07:22.688750 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:07:27.024268 containerd[1649]: time="2026-01-23T01:07:27.023651097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 01:07:27.353359 containerd[1649]: time="2026-01-23T01:07:27.353240310Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:27.355060 containerd[1649]: time="2026-01-23T01:07:27.355001484Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 01:07:27.355142 containerd[1649]: time="2026-01-23T01:07:27.355104556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 01:07:27.355391 kubelet[2882]: E0123 01:07:27.355340 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:07:27.355868 kubelet[2882]: E0123 01:07:27.355399 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:07:27.355868 
kubelet[2882]: E0123 01:07:27.355493 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:27.357549 containerd[1649]: time="2026-01-23T01:07:27.357473644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 01:07:27.684211 containerd[1649]: time="2026-01-23T01:07:27.682347261Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:27.686454 containerd[1649]: time="2026-01-23T01:07:27.686217267Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 01:07:27.686454 containerd[1649]: time="2026-01-23T01:07:27.686387513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 01:07:27.686764 kubelet[2882]: E0123 01:07:27.686686 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:07:27.686864 kubelet[2882]: E0123 01:07:27.686772 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:07:27.686941 kubelet[2882]: E0123 01:07:27.686913 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:27.687095 kubelet[2882]: E0123 01:07:27.687004 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:07:28.023081 containerd[1649]: time="2026-01-23T01:07:28.022615426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 01:07:28.402760 containerd[1649]: time="2026-01-23T01:07:28.402714935Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:07:28.405018 containerd[1649]: time="2026-01-23T01:07:28.404935397Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 01:07:28.405086 containerd[1649]: time="2026-01-23T01:07:28.405003244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 01:07:28.405179 kubelet[2882]: E0123 01:07:28.405139 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:07:28.405739 kubelet[2882]: E0123 01:07:28.405191 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:07:28.405739 kubelet[2882]: E0123 01:07:28.405278 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 01:07:28.405739 kubelet[2882]: E0123 01:07:28.405311 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:07:33.024264 kubelet[2882]: E0123 01:07:33.024226 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:07:34.025212 kubelet[2882]: E0123 01:07:34.025122 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:07:35.023942 kubelet[2882]: E0123 01:07:35.023884 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:36.024302 kubelet[2882]: E0123 01:07:36.024254 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:07:40.023136 kubelet[2882]: E0123 01:07:40.023087 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:07:43.026228 kubelet[2882]: E0123 01:07:43.026086 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:07:46.024680 kubelet[2882]: E0123 01:07:46.023477 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:07:47.024302 kubelet[2882]: E0123 01:07:47.024255 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:47.025155 kubelet[2882]: E0123 01:07:47.025027 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:07:49.030002 kubelet[2882]: E0123 01:07:49.029234 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:07:53.023962 kubelet[2882]: E0123 01:07:53.023785 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:07:57.023330 kubelet[2882]: E0123 01:07:57.023252 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:07:58.023103 kubelet[2882]: E0123 01:07:58.022751 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:07:59.030715 kubelet[2882]: E0123 01:07:59.030611 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:08:00.022188 kubelet[2882]: E0123 01:08:00.022155 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:08:03.025481 containerd[1649]: time="2026-01-23T01:08:03.025429473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 01:08:03.353852 containerd[1649]: time="2026-01-23T01:08:03.353810151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:03.355624 containerd[1649]: time="2026-01-23T01:08:03.355583904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 01:08:03.355755 containerd[1649]: time="2026-01-23T01:08:03.355605949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 01:08:03.355907 kubelet[2882]: E0123 01:08:03.355823 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:08:03.355907 kubelet[2882]: E0123 01:08:03.355861 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:08:03.356839 kubelet[2882]: E0123 01:08:03.356637 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:03.357853 containerd[1649]: time="2026-01-23T01:08:03.357680927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 01:08:03.683796 containerd[1649]: time="2026-01-23T01:08:03.683359629Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:03.685883 containerd[1649]: time="2026-01-23T01:08:03.685790947Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 01:08:03.685883 containerd[1649]: time="2026-01-23T01:08:03.685850576Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 01:08:03.686367 kubelet[2882]: E0123 01:08:03.686136 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:08:03.686367 kubelet[2882]: E0123 01:08:03.686175 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:08:03.686539 kubelet[2882]: E0123 01:08:03.686525 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:03.687071 kubelet[2882]: E0123 01:08:03.687042 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:08:08.022552 kubelet[2882]: E0123 01:08:08.022506 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:08:11.025168 containerd[1649]: time="2026-01-23T01:08:11.024351224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 01:08:11.353397 containerd[1649]: time="2026-01-23T01:08:11.353247655Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:11.355068 containerd[1649]: time="2026-01-23T01:08:11.354966451Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 01:08:11.355068 containerd[1649]: time="2026-01-23T01:08:11.355041024Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" 
Jan 23 01:08:11.355227 kubelet[2882]: E0123 01:08:11.355183 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:08:11.355456 kubelet[2882]: E0123 01:08:11.355235 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:08:11.355456 kubelet[2882]: E0123 01:08:11.355378 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:11.357735 containerd[1649]: time="2026-01-23T01:08:11.357666815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 01:08:11.693632 containerd[1649]: time="2026-01-23T01:08:11.693218418Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:11.695191 containerd[1649]: time="2026-01-23T01:08:11.695103799Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 01:08:11.695300 containerd[1649]: time="2026-01-23T01:08:11.695267383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 01:08:11.695491 kubelet[2882]: E0123 01:08:11.695453 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:08:11.695631 kubelet[2882]: E0123 01:08:11.695532 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:08:11.695764 kubelet[2882]: E0123 01:08:11.695685 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:11.696094 kubelet[2882]: E0123 01:08:11.695720 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:08:12.023242 containerd[1649]: time="2026-01-23T01:08:12.022672115Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 01:08:12.382945 containerd[1649]: time="2026-01-23T01:08:12.382821871Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:12.385485 containerd[1649]: time="2026-01-23T01:08:12.385427381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 01:08:12.385687 containerd[1649]: time="2026-01-23T01:08:12.385578144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 01:08:12.386111 kubelet[2882]: E0123 01:08:12.386047 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:08:12.387146 kubelet[2882]: E0123 01:08:12.386638 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:08:12.387146 kubelet[2882]: E0123 01:08:12.386778 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:12.387146 kubelet[2882]: E0123 01:08:12.386842 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:08:13.023769 containerd[1649]: time="2026-01-23T01:08:13.023498053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:08:13.359026 containerd[1649]: time="2026-01-23T01:08:13.358958644Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:13.361246 containerd[1649]: time="2026-01-23T01:08:13.361207494Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:08:13.361504 containerd[1649]: time="2026-01-23T01:08:13.361297557Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:08:13.361677 kubelet[2882]: E0123 01:08:13.361635 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:08:13.361792 kubelet[2882]: E0123 01:08:13.361768 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:08:13.362157 kubelet[2882]: E0123 01:08:13.361963 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:13.362292 kubelet[2882]: E0123 01:08:13.362248 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:08:14.025455 containerd[1649]: time="2026-01-23T01:08:14.024449241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:08:14.359326 containerd[1649]: time="2026-01-23T01:08:14.359247569Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:14.361287 
containerd[1649]: time="2026-01-23T01:08:14.361228705Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:08:14.361374 containerd[1649]: time="2026-01-23T01:08:14.361351516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:08:14.361657 kubelet[2882]: E0123 01:08:14.361615 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:08:14.361900 kubelet[2882]: E0123 01:08:14.361674 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:08:14.361900 kubelet[2882]: E0123 01:08:14.361771 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:14.361900 kubelet[2882]: E0123 01:08:14.361818 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:08:15.027415 kubelet[2882]: E0123 01:08:15.027321 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:08:23.023068 containerd[1649]: 
time="2026-01-23T01:08:23.023012437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 01:08:23.362541 containerd[1649]: time="2026-01-23T01:08:23.361719973Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:08:23.363756 containerd[1649]: time="2026-01-23T01:08:23.363713517Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 01:08:23.363869 containerd[1649]: time="2026-01-23T01:08:23.363805304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 01:08:23.364037 kubelet[2882]: E0123 01:08:23.363997 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:08:23.364461 kubelet[2882]: E0123 01:08:23.364058 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:08:23.364461 kubelet[2882]: E0123 01:08:23.364163 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 01:08:23.364602 kubelet[2882]: E0123 01:08:23.364574 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:08:24.026053 kubelet[2882]: E0123 01:08:24.025731 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:08:25.027898 kubelet[2882]: E0123 01:08:25.027828 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:08:26.022392 kubelet[2882]: E0123 01:08:26.022357 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:08:26.023460 kubelet[2882]: E0123 01:08:26.023426 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:08:29.026527 kubelet[2882]: E0123 01:08:29.024717 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:08:31.102814 systemd[1]: Started sshd@9-10.0.5.114:22-20.161.92.111:47754.service - OpenSSH per-connection server daemon (20.161.92.111:47754). 
Jan 23 01:08:31.762703 sshd[5021]: Accepted publickey for core from 20.161.92.111 port 47754 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:31.765965 sshd-session[5021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:31.778323 systemd-logind[1625]: New session 10 of user core. Jan 23 01:08:31.787678 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 23 01:08:32.306035 sshd[5024]: Connection closed by 20.161.92.111 port 47754 Jan 23 01:08:32.305684 sshd-session[5021]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:32.309500 systemd[1]: sshd@9-10.0.5.114:22-20.161.92.111:47754.service: Deactivated successfully. Jan 23 01:08:32.312007 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 01:08:32.313319 systemd-logind[1625]: Session 10 logged out. Waiting for processes to exit. Jan 23 01:08:32.316599 systemd-logind[1625]: Removed session 10. Jan 23 01:08:37.027572 kubelet[2882]: E0123 01:08:37.027523 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:08:37.028527 kubelet[2882]: E0123 01:08:37.028227 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:08:37.422114 systemd[1]: Started sshd@10-10.0.5.114:22-20.161.92.111:40534.service - OpenSSH per-connection server daemon (20.161.92.111:40534). 
Jan 23 01:08:38.024039 kubelet[2882]: E0123 01:08:38.023503 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:08:38.085084 sshd[5038]: Accepted publickey for core from 20.161.92.111 port 40534 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:38.086599 sshd-session[5038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:38.094636 systemd-logind[1625]: New session 11 of user core. Jan 23 01:08:38.100792 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 23 01:08:38.626666 sshd[5041]: Connection closed by 20.161.92.111 port 40534 Jan 23 01:08:38.627221 sshd-session[5038]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:38.630215 systemd-logind[1625]: Session 11 logged out. Waiting for processes to exit. Jan 23 01:08:38.630655 systemd[1]: sshd@10-10.0.5.114:22-20.161.92.111:40534.service: Deactivated successfully. Jan 23 01:08:38.633941 systemd[1]: session-11.scope: Deactivated successfully. Jan 23 01:08:38.636348 systemd-logind[1625]: Removed session 11. Jan 23 01:08:39.026649 kubelet[2882]: E0123 01:08:39.025263 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:08:41.025073 kubelet[2882]: E0123 01:08:41.024341 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:08:42.022759 kubelet[2882]: E0123 01:08:42.022722 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:08:43.739604 systemd[1]: Started sshd@11-10.0.5.114:22-20.161.92.111:43384.service - OpenSSH per-connection server daemon (20.161.92.111:43384). Jan 23 01:08:44.359402 sshd[5078]: Accepted publickey for core from 20.161.92.111 port 43384 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:44.362496 sshd-session[5078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:44.374544 systemd-logind[1625]: New session 12 of user core. Jan 23 01:08:44.379768 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 23 01:08:44.896131 sshd[5081]: Connection closed by 20.161.92.111 port 43384 Jan 23 01:08:44.897665 sshd-session[5078]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:44.901123 systemd[1]: sshd@11-10.0.5.114:22-20.161.92.111:43384.service: Deactivated successfully. Jan 23 01:08:44.903505 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 01:08:44.906082 systemd-logind[1625]: Session 12 logged out. Waiting for processes to exit. Jan 23 01:08:44.906901 systemd-logind[1625]: Removed session 12. Jan 23 01:08:45.008767 systemd[1]: Started sshd@12-10.0.5.114:22-20.161.92.111:43390.service - OpenSSH per-connection server daemon (20.161.92.111:43390). Jan 23 01:08:45.617268 sshd[5094]: Accepted publickey for core from 20.161.92.111 port 43390 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:45.618379 sshd-session[5094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:45.626386 systemd-logind[1625]: New session 13 of user core. Jan 23 01:08:45.635253 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 23 01:08:46.204293 sshd[5097]: Connection closed by 20.161.92.111 port 43390 Jan 23 01:08:46.204851 sshd-session[5094]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:46.208936 systemd[1]: sshd@12-10.0.5.114:22-20.161.92.111:43390.service: Deactivated successfully. Jan 23 01:08:46.212204 systemd[1]: session-13.scope: Deactivated successfully. Jan 23 01:08:46.213958 systemd-logind[1625]: Session 13 logged out. Waiting for processes to exit. Jan 23 01:08:46.216134 systemd-logind[1625]: Removed session 13. Jan 23 01:08:46.313593 systemd[1]: Started sshd@13-10.0.5.114:22-20.161.92.111:43398.service - OpenSSH per-connection server daemon (20.161.92.111:43398). Jan 23 01:08:46.934552 sshd[5107]: Accepted publickey for core from 20.161.92.111 port 43398 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:46.935524 sshd-session[5107]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:46.939332 systemd-logind[1625]: New session 14 of user core. Jan 23 01:08:46.949662 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 23 01:08:47.463744 sshd[5110]: Connection closed by 20.161.92.111 port 43398 Jan 23 01:08:47.464727 sshd-session[5107]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:47.470176 systemd[1]: sshd@13-10.0.5.114:22-20.161.92.111:43398.service: Deactivated successfully. 
Jan 23 01:08:47.472055 systemd[1]: session-14.scope: Deactivated successfully. Jan 23 01:08:47.473015 systemd-logind[1625]: Session 14 logged out. Waiting for processes to exit. Jan 23 01:08:47.474561 systemd-logind[1625]: Removed session 14. Jan 23 01:08:49.025133 kubelet[2882]: E0123 01:08:49.024152 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:08:52.022878 kubelet[2882]: E0123 01:08:52.021638 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:08:52.023477 kubelet[2882]: E0123 01:08:52.023446 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:08:52.572684 systemd[1]: Started sshd@14-10.0.5.114:22-20.161.92.111:58314.service - OpenSSH per-connection server daemon (20.161.92.111:58314). 
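Note the bracketed err="[...]" form for csi-node-driver-p7kwl above: that pod has two containers waiting on different images, and kubelet folds both StartContainer failures into a single pod sync error. A hypothetical sketch of that kind of aggregation using Go's errors.Join (illustrative only; kubelet's actual aggregation code differs):

package main

import (
	"errors"
	"fmt"
)

func main() {
	// One error per container that failed to start.
	errCSI := fmt.Errorf("failed to %q for %q with ImagePullBackOff",
		"StartContainer", "calico-csi")
	errReg := fmt.Errorf("failed to %q for %q with ImagePullBackOff",
		"StartContainer", "csi-node-driver-registrar")

	// Join them into one pod-level sync error; Error() reports both.
	err := errors.Join(errCSI, errReg)
	fmt.Println(err)
}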
Jan 23 01:08:53.024468 kubelet[2882]: E0123 01:08:53.023357 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:08:53.194758 sshd[5128]: Accepted publickey for core from 20.161.92.111 port 58314 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:53.196218 sshd-session[5128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:53.202890 systemd-logind[1625]: New session 15 of user core. Jan 23 01:08:53.205779 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 23 01:08:53.718406 sshd[5131]: Connection closed by 20.161.92.111 port 58314 Jan 23 01:08:53.719042 sshd-session[5128]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:53.728639 systemd[1]: sshd@14-10.0.5.114:22-20.161.92.111:58314.service: Deactivated successfully. Jan 23 01:08:53.735539 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 01:08:53.737264 systemd-logind[1625]: Session 15 logged out. Waiting for processes to exit. Jan 23 01:08:53.739663 systemd-logind[1625]: Removed session 15. Jan 23 01:08:54.025970 kubelet[2882]: E0123 01:08:54.025363 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:08:55.023787 kubelet[2882]: E0123 01:08:55.023723 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:08:58.836694 systemd[1]: Started sshd@15-10.0.5.114:22-20.161.92.111:58326.service - OpenSSH per-connection server daemon (20.161.92.111:58326). 
Jan 23 01:08:59.467059 sshd[5143]: Accepted publickey for core from 20.161.92.111 port 58326 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:08:59.469039 sshd-session[5143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:08:59.476728 systemd-logind[1625]: New session 16 of user core. Jan 23 01:08:59.480591 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 23 01:08:59.949276 sshd[5148]: Connection closed by 20.161.92.111 port 58326 Jan 23 01:08:59.950114 sshd-session[5143]: pam_unix(sshd:session): session closed for user core Jan 23 01:08:59.956871 systemd[1]: sshd@15-10.0.5.114:22-20.161.92.111:58326.service: Deactivated successfully. Jan 23 01:08:59.961677 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 01:08:59.963483 systemd-logind[1625]: Session 16 logged out. Waiting for processes to exit. Jan 23 01:08:59.965109 systemd-logind[1625]: Removed session 16. Jan 23 01:09:01.023249 kubelet[2882]: E0123 01:09:01.023194 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:09:05.023132 kubelet[2882]: E0123 01:09:05.023095 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:09:05.057355 systemd[1]: Started sshd@16-10.0.5.114:22-20.161.92.111:57770.service - OpenSSH per-connection server daemon (20.161.92.111:57770). Jan 23 01:09:05.663675 sshd[5159]: Accepted publickey for core from 20.161.92.111 port 57770 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:05.664869 sshd-session[5159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:05.671634 systemd-logind[1625]: New session 17 of user core. Jan 23 01:09:05.677673 systemd[1]: Started session-17.scope - Session 17 of User core. 
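Each failure carries the prefix rpc error: code = NotFound because the message originates as a gRPC status crossing the CRI boundary between kubelet and containerd. A small sketch of constructing and inspecting such a status with google.golang.org/grpc/status (an illustration, not the actual kubelet or containerd code):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	// Build a status error shaped like the ones in the log above.
	err := status.Errorf(codes.NotFound,
		"failed to resolve reference %q: not found",
		"ghcr.io/flatcar/calico/goldmane:v3.30.4")
	fmt.Println(err) // rpc error: code = NotFound desc = ...

	// Callers can branch on the code instead of string-matching.
	if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
		fmt.Println("tag does not exist in the registry; retrying will not help")
	}
}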
Jan 23 01:09:06.023983 kubelet[2882]: E0123 01:09:06.023888 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:09:06.159944 sshd[5165]: Connection closed by 20.161.92.111 port 57770 Jan 23 01:09:06.161195 sshd-session[5159]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:06.165002 systemd-logind[1625]: Session 17 logged out. Waiting for processes to exit. Jan 23 01:09:06.165715 systemd[1]: sshd@16-10.0.5.114:22-20.161.92.111:57770.service: Deactivated successfully. Jan 23 01:09:06.168201 systemd[1]: session-17.scope: Deactivated successfully. Jan 23 01:09:06.171015 systemd-logind[1625]: Removed session 17. Jan 23 01:09:06.269603 systemd[1]: Started sshd@17-10.0.5.114:22-20.161.92.111:57774.service - OpenSSH per-connection server daemon (20.161.92.111:57774). Jan 23 01:09:06.881050 sshd[5176]: Accepted publickey for core from 20.161.92.111 port 57774 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:06.882138 sshd-session[5176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:06.885842 systemd-logind[1625]: New session 18 of user core. Jan 23 01:09:06.894640 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 23 01:09:07.023236 kubelet[2882]: E0123 01:09:07.023195 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:09:07.890709 sshd[5179]: Connection closed by 20.161.92.111 port 57774 Jan 23 01:09:07.892580 sshd-session[5176]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:07.897954 systemd[1]: sshd@17-10.0.5.114:22-20.161.92.111:57774.service: Deactivated successfully. Jan 23 01:09:07.901135 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 01:09:07.903102 systemd-logind[1625]: Session 18 logged out. Waiting for processes to exit. Jan 23 01:09:07.904243 systemd-logind[1625]: Removed session 18. Jan 23 01:09:08.000996 systemd[1]: Started sshd@18-10.0.5.114:22-20.161.92.111:57780.service - OpenSSH per-connection server daemon (20.161.92.111:57780). 
Jan 23 01:09:08.022828 kubelet[2882]: E0123 01:09:08.022785 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:09:08.632061 sshd[5189]: Accepted publickey for core from 20.161.92.111 port 57780 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:08.633504 sshd-session[5189]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:08.646201 systemd-logind[1625]: New session 19 of user core. Jan 23 01:09:08.652843 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 23 01:09:09.028212 kubelet[2882]: E0123 01:09:09.028085 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:09:09.582170 sshd[5216]: Connection closed by 20.161.92.111 port 57780 Jan 23 01:09:09.582734 sshd-session[5189]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:09.587303 systemd[1]: sshd@18-10.0.5.114:22-20.161.92.111:57780.service: Deactivated successfully. Jan 23 01:09:09.590446 systemd[1]: session-19.scope: Deactivated successfully. Jan 23 01:09:09.591771 systemd-logind[1625]: Session 19 logged out. Waiting for processes to exit. Jan 23 01:09:09.594278 systemd-logind[1625]: Removed session 19. Jan 23 01:09:09.688168 systemd[1]: Started sshd@19-10.0.5.114:22-20.161.92.111:57794.service - OpenSSH per-connection server daemon (20.161.92.111:57794). Jan 23 01:09:10.296387 sshd[5231]: Accepted publickey for core from 20.161.92.111 port 57794 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:10.298609 sshd-session[5231]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:10.305541 systemd-logind[1625]: New session 20 of user core. Jan 23 01:09:10.307640 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 23 01:09:10.929991 sshd[5234]: Connection closed by 20.161.92.111 port 57794 Jan 23 01:09:10.930351 sshd-session[5231]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:10.933844 systemd[1]: sshd@19-10.0.5.114:22-20.161.92.111:57794.service: Deactivated successfully. 
Jan 23 01:09:10.935954 systemd[1]: session-20.scope: Deactivated successfully. Jan 23 01:09:10.938948 systemd-logind[1625]: Session 20 logged out. Waiting for processes to exit. Jan 23 01:09:10.939822 systemd-logind[1625]: Removed session 20. Jan 23 01:09:11.035227 systemd[1]: Started sshd@20-10.0.5.114:22-20.161.92.111:57798.service - OpenSSH per-connection server daemon (20.161.92.111:57798). Jan 23 01:09:11.654595 sshd[5246]: Accepted publickey for core from 20.161.92.111 port 57798 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:11.657396 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:11.670985 systemd-logind[1625]: New session 21 of user core. Jan 23 01:09:11.677846 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 23 01:09:12.140211 sshd[5249]: Connection closed by 20.161.92.111 port 57798 Jan 23 01:09:12.140884 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:12.144904 systemd-logind[1625]: Session 21 logged out. Waiting for processes to exit. Jan 23 01:09:12.145035 systemd[1]: sshd@20-10.0.5.114:22-20.161.92.111:57798.service: Deactivated successfully. Jan 23 01:09:12.146877 systemd[1]: session-21.scope: Deactivated successfully. Jan 23 01:09:12.148505 systemd-logind[1625]: Removed session 21. Jan 23 01:09:15.024802 kubelet[2882]: E0123 01:09:15.024465 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:09:17.250721 systemd[1]: Started sshd@21-10.0.5.114:22-20.161.92.111:36040.service - OpenSSH per-connection server daemon (20.161.92.111:36040). Jan 23 01:09:17.857358 sshd[5263]: Accepted publickey for core from 20.161.92.111 port 36040 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:17.860462 sshd-session[5263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:17.878137 systemd-logind[1625]: New session 22 of user core. Jan 23 01:09:17.886854 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 23 01:09:18.351331 sshd[5266]: Connection closed by 20.161.92.111 port 36040 Jan 23 01:09:18.350350 sshd-session[5263]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:18.354234 systemd-logind[1625]: Session 22 logged out. Waiting for processes to exit. Jan 23 01:09:18.354857 systemd[1]: sshd@21-10.0.5.114:22-20.161.92.111:36040.service: Deactivated successfully. Jan 23 01:09:18.357858 systemd[1]: session-22.scope: Deactivated successfully. Jan 23 01:09:18.362299 systemd-logind[1625]: Removed session 22. 
Jan 23 01:09:19.023753 kubelet[2882]: E0123 01:09:19.023703 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:09:19.024684 kubelet[2882]: E0123 01:09:19.024493 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:09:22.022712 kubelet[2882]: E0123 01:09:22.022652 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:09:23.023944 kubelet[2882]: E0123 01:09:23.023844 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:09:23.025920 kubelet[2882]: E0123 01:09:23.025593 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:09:23.458955 systemd[1]: Started sshd@22-10.0.5.114:22-20.161.92.111:58512.service - OpenSSH per-connection server daemon (20.161.92.111:58512). Jan 23 01:09:24.086969 sshd[5283]: Accepted publickey for core from 20.161.92.111 port 58512 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:24.091843 sshd-session[5283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:24.102166 systemd-logind[1625]: New session 23 of user core. Jan 23 01:09:24.109790 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 23 01:09:24.630549 sshd[5286]: Connection closed by 20.161.92.111 port 58512 Jan 23 01:09:24.631300 sshd-session[5283]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:24.635913 systemd[1]: sshd@22-10.0.5.114:22-20.161.92.111:58512.service: Deactivated successfully. Jan 23 01:09:24.639830 systemd[1]: session-23.scope: Deactivated successfully. Jan 23 01:09:24.641192 systemd-logind[1625]: Session 23 logged out. Waiting for processes to exit. Jan 23 01:09:24.642384 systemd-logind[1625]: Removed session 23. Jan 23 01:09:27.023244 kubelet[2882]: E0123 01:09:27.023214 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:09:29.734704 systemd[1]: Started sshd@23-10.0.5.114:22-20.161.92.111:58516.service - OpenSSH per-connection server daemon (20.161.92.111:58516). Jan 23 01:09:30.347681 sshd[5299]: Accepted publickey for core from 20.161.92.111 port 58516 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:30.350577 sshd-session[5299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:30.364255 systemd-logind[1625]: New session 24 of user core. Jan 23 01:09:30.371244 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 23 01:09:30.886275 sshd[5302]: Connection closed by 20.161.92.111 port 58516 Jan 23 01:09:30.885468 sshd-session[5299]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:30.892074 systemd-logind[1625]: Session 24 logged out. Waiting for processes to exit. Jan 23 01:09:30.892838 systemd[1]: sshd@23-10.0.5.114:22-20.161.92.111:58516.service: Deactivated successfully. Jan 23 01:09:30.897366 systemd[1]: session-24.scope: Deactivated successfully. Jan 23 01:09:30.899761 systemd-logind[1625]: Removed session 24. 
Jan 23 01:09:32.022908 kubelet[2882]: E0123 01:09:32.022869 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:09:33.024808 containerd[1649]: time="2026-01-23T01:09:33.024343894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 01:09:33.382247 containerd[1649]: time="2026-01-23T01:09:33.382202527Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:33.385577 containerd[1649]: time="2026-01-23T01:09:33.384799601Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 01:09:33.385772 containerd[1649]: time="2026-01-23T01:09:33.384813273Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 01:09:33.385978 kubelet[2882]: E0123 01:09:33.385946 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:09:33.386776 kubelet[2882]: E0123 01:09:33.385989 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 01:09:33.386776 kubelet[2882]: E0123 01:09:33.386060 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7cd6c7ccb7-hrmnq_calico-system(f7eac782-ccd6-467d-a1a8-1d1f5f096853): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:33.386776 kubelet[2882]: E0123 01:09:33.386089 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" 
podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:09:34.026339 containerd[1649]: time="2026-01-23T01:09:34.025818664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 01:09:34.367926 containerd[1649]: time="2026-01-23T01:09:34.367879458Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:34.369664 containerd[1649]: time="2026-01-23T01:09:34.369630249Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 01:09:34.369720 containerd[1649]: time="2026-01-23T01:09:34.369696444Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 01:09:34.369874 kubelet[2882]: E0123 01:09:34.369840 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:09:34.369933 kubelet[2882]: E0123 01:09:34.369878 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 01:09:34.369955 kubelet[2882]: E0123 01:09:34.369945 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:34.371155 containerd[1649]: time="2026-01-23T01:09:34.370954576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 01:09:34.702615 containerd[1649]: time="2026-01-23T01:09:34.702497180Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:34.704330 containerd[1649]: time="2026-01-23T01:09:34.704277381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 01:09:34.704414 containerd[1649]: time="2026-01-23T01:09:34.704310373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 01:09:34.704553 kubelet[2882]: E0123 01:09:34.704500 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:09:34.704826 kubelet[2882]: E0123 01:09:34.704564 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 01:09:34.704826 kubelet[2882]: E0123 01:09:34.704643 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p7kwl_calico-system(bfc1d39e-fed0-4ad0-8a64-aa0c649c314e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:34.704826 kubelet[2882]: E0123 01:09:34.704709 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:09:36.000849 systemd[1]: Started sshd@24-10.0.5.114:22-20.161.92.111:59038.service - OpenSSH per-connection server daemon (20.161.92.111:59038). 
Jan 23 01:09:36.024523 containerd[1649]: time="2026-01-23T01:09:36.024466758Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:09:36.358790 containerd[1649]: time="2026-01-23T01:09:36.358559567Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:36.361109 containerd[1649]: time="2026-01-23T01:09:36.360809731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:09:36.361258 containerd[1649]: time="2026-01-23T01:09:36.360869466Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:09:36.361523 kubelet[2882]: E0123 01:09:36.361444 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:09:36.361523 kubelet[2882]: E0123 01:09:36.361480 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:09:36.362526 kubelet[2882]: E0123 01:09:36.361992 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-pnljq_calico-apiserver(38be9c86-8462-40da-b6c5-51dc537715d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:36.362526 kubelet[2882]: E0123 01:09:36.362036 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:09:36.638844 sshd[5315]: Accepted publickey for core from 20.161.92.111 port 59038 ssh2: RSA SHA256:tQIJN5HlXk0+c/kUIMdsIlPUXB6L6udcPrUheN99J8w Jan 23 01:09:36.641561 sshd-session[5315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 01:09:36.646873 systemd-logind[1625]: New session 25 of user core. Jan 23 01:09:36.654799 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 23 01:09:37.023890 containerd[1649]: time="2026-01-23T01:09:37.023785165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 01:09:37.168491 sshd[5318]: Connection closed by 20.161.92.111 port 59038 Jan 23 01:09:37.169626 sshd-session[5315]: pam_unix(sshd:session): session closed for user core Jan 23 01:09:37.179697 systemd-logind[1625]: Session 25 logged out. Waiting for processes to exit. Jan 23 01:09:37.182581 systemd[1]: sshd@24-10.0.5.114:22-20.161.92.111:59038.service: Deactivated successfully. Jan 23 01:09:37.186474 systemd[1]: session-25.scope: Deactivated successfully. Jan 23 01:09:37.189746 systemd-logind[1625]: Removed session 25. Jan 23 01:09:37.373168 containerd[1649]: time="2026-01-23T01:09:37.372744093Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:37.375355 containerd[1649]: time="2026-01-23T01:09:37.375189963Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 01:09:37.375355 containerd[1649]: time="2026-01-23T01:09:37.375260443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 01:09:37.375794 kubelet[2882]: E0123 01:09:37.375736 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:09:37.376232 kubelet[2882]: E0123 01:09:37.376053 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 01:09:37.376495 kubelet[2882]: E0123 01:09:37.376441 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:37.378581 containerd[1649]: time="2026-01-23T01:09:37.378549747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 01:09:37.719494 containerd[1649]: time="2026-01-23T01:09:37.719386285Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:37.721361 containerd[1649]: time="2026-01-23T01:09:37.721286084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 01:09:37.721361 containerd[1649]: time="2026-01-23T01:09:37.721334458Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 01:09:37.721593 kubelet[2882]: E0123 01:09:37.721564 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:09:37.721800 kubelet[2882]: E0123 01:09:37.721659 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 01:09:37.721800 kubelet[2882]: E0123 01:09:37.721730 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-86c8df774c-zx58q_calico-system(1e833cc8-d29c-40ae-955f-226c562444ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:37.721800 kubelet[2882]: E0123 01:09:37.721766 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:09:39.029263 kubelet[2882]: E0123 01:09:39.029222 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:09:47.025068 containerd[1649]: time="2026-01-23T01:09:47.025032157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 01:09:47.356522 containerd[1649]: time="2026-01-23T01:09:47.356460849Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:47.358412 containerd[1649]: time="2026-01-23T01:09:47.358119934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 01:09:47.358412 containerd[1649]: time="2026-01-23T01:09:47.358120766Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 01:09:47.359295 kubelet[2882]: E0123 01:09:47.358720 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:09:47.359295 kubelet[2882]: E0123 01:09:47.358759 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 01:09:47.359295 kubelet[2882]: E0123 01:09:47.358824 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57669cbdbb-hrfjp_calico-apiserver(ade066f3-57a6-4b59-8736-58fe8e8e36bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:47.359295 kubelet[2882]: E0123 01:09:47.358851 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:09:48.023702 kubelet[2882]: E0123 01:09:48.023448 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:09:49.025588 kubelet[2882]: E0123 01:09:49.025478 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:09:51.024754 containerd[1649]: time="2026-01-23T01:09:51.024657494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 01:09:51.026555 kubelet[2882]: E0123 01:09:51.025931 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:09:51.026555 kubelet[2882]: E0123 01:09:51.026228 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:09:51.363023 containerd[1649]: time="2026-01-23T01:09:51.362793839Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 01:09:51.365561 containerd[1649]: time="2026-01-23T01:09:51.365280172Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 01:09:51.365561 containerd[1649]: time="2026-01-23T01:09:51.365384392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 01:09:51.366107 kubelet[2882]: E0123 01:09:51.365572 2882 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:09:51.366107 kubelet[2882]: E0123 01:09:51.365621 2882 kuberuntime_image.go:43] "Failed to pull image" err="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 01:09:51.366107 kubelet[2882]: E0123 01:09:51.365710 2882 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sx6dp_calico-system(abca6cb1-d45d-4716-90fc-9fea5bf2bb4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 01:09:51.366107 kubelet[2882]: E0123 01:09:51.365752 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:09:51.623613 update_engine[1630]: I20260123 01:09:51.622695 1630 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 23 01:09:51.623613 update_engine[1630]: I20260123 01:09:51.622768 1630 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 23 01:09:51.623613 update_engine[1630]: I20260123 01:09:51.623070 1630 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 23 01:09:51.624221 update_engine[1630]: I20260123 01:09:51.624182 1630 omaha_request_params.cc:62] Current group set to stable Jan 23 01:09:51.627494 update_engine[1630]: I20260123 01:09:51.627454 1630 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 23 01:09:51.627494 update_engine[1630]: I20260123 01:09:51.627489 1630 update_attempter.cc:643] Scheduling an action processor start. 
Jan 23 01:09:51.627590 update_engine[1630]: I20260123 01:09:51.627537 1630 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 23 01:09:51.627611 update_engine[1630]: I20260123 01:09:51.627596 1630 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 23 01:09:51.627713 update_engine[1630]: I20260123 01:09:51.627693 1630 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 23 01:09:51.627734 update_engine[1630]: I20260123 01:09:51.627712 1630 omaha_request_action.cc:272] Request: Jan 23 01:09:51.627734 update_engine[1630]: [Omaha request XML body elided] Jan 23 01:09:51.627892 update_engine[1630]: I20260123 01:09:51.627724 1630 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 23 01:09:51.638621 locksmithd[1664]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 23 01:09:51.640057 update_engine[1630]: I20260123 01:09:51.639973 1630 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 23 01:09:51.640986 update_engine[1630]: I20260123 01:09:51.640792 1630 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 23 01:09:51.646999 update_engine[1630]: E20260123 01:09:51.646760 1630 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 23 01:09:51.646999 update_engine[1630]: I20260123 01:09:51.646888 1630 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 23 01:10:01.618488 update_engine[1630]: I20260123 01:10:01.618364 1630 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 23 01:10:01.620977 update_engine[1630]: I20260123 01:10:01.618579 1630 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 23 01:10:01.620977 update_engine[1630]: I20260123 01:10:01.620013 1630 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
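The update_engine entries just before and after this point show the Flatcar update client performing an Omaha check against a server URL literally set to "disabled": name resolution fails, and the fetcher retries on a timer (retry 1 at 01:09:51, retry 2 at 01:10:01). A toy Go loop with the same observable behavior; the 10-second spacing is read off the timestamps, and the real fetcher's retry policy is more involved:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 5 * time.Second}
	for attempt := 1; attempt <= 3; attempt++ {
		// "disabled" is not a resolvable host, matching the log's
		// "Could not resolve host: disabled".
		resp, err := client.Get("http://disabled/")
		if err == nil {
			resp.Body.Close()
			return
		}
		fmt.Printf("No HTTP response, retry %d: %v\n", attempt, err)
		time.Sleep(10 * time.Second) // spacing observed between retries
	}
}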
Jan 23 01:10:01.626894 update_engine[1630]: E20260123 01:10:01.626829 1630 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 23 01:10:01.627042 update_engine[1630]: I20260123 01:10:01.627002 1630 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 23 01:10:02.024965 kubelet[2882]: E0123 01:10:02.024774 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-hrfjp" podUID="ade066f3-57a6-4b59-8736-58fe8e8e36bc" Jan 23 01:10:02.990183 kubelet[2882]: E0123 01:10:02.990097 2882 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.5.114:48902->10.0.5.89:2379: read: connection timed out" Jan 23 01:10:03.023000 kubelet[2882]: E0123 01:10:03.022936 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cd6c7ccb7-hrmnq" podUID="f7eac782-ccd6-467d-a1a8-1d1f5f096853" Jan 23 01:10:03.118165 systemd[1]: cri-containerd-ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31.scope: Deactivated successfully. Jan 23 01:10:03.119794 systemd[1]: cri-containerd-ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31.scope: Consumed 31.719s CPU time, 109.8M memory peak. Jan 23 01:10:03.122667 containerd[1649]: time="2026-01-23T01:10:03.122235324Z" level=info msg="received container exit event container_id:\"ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31\" id:\"ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31\" pid:3208 exit_status:1 exited_at:{seconds:1769130603 nanos:121578545}" Jan 23 01:10:03.163374 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31-rootfs.mount: Deactivated successfully. Jan 23 01:10:03.644400 systemd[1]: cri-containerd-1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af.scope: Deactivated successfully. Jan 23 01:10:03.646114 systemd[1]: cri-containerd-1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af.scope: Consumed 3.825s CPU time, 61M memory peak. 
Jan 23 01:10:03.648869 containerd[1649]: time="2026-01-23T01:10:03.648680155Z" level=info msg="received container exit event container_id:\"1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af\" id:\"1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af\" pid:2713 exit_status:1 exited_at:{seconds:1769130603 nanos:647935196}" Jan 23 01:10:03.702281 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af-rootfs.mount: Deactivated successfully. Jan 23 01:10:03.838486 kubelet[2882]: I0123 01:10:03.838448 2882 scope.go:117] "RemoveContainer" containerID="ba9f0b581b138c60651bffc07f8d59dd595d051d8f01db430f460ac093b90e31" Jan 23 01:10:03.844172 kubelet[2882]: I0123 01:10:03.844104 2882 scope.go:117] "RemoveContainer" containerID="1d71a7ac4b896b66d382793a06300f72b50dfde0ec237b84fd807f5bdcbc61af" Jan 23 01:10:03.851277 containerd[1649]: time="2026-01-23T01:10:03.850798350Z" level=info msg="CreateContainer within sandbox \"c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 01:10:03.851277 containerd[1649]: time="2026-01-23T01:10:03.850893080Z" level=info msg="CreateContainer within sandbox \"1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 23 01:10:03.871895 containerd[1649]: time="2026-01-23T01:10:03.871833289Z" level=info msg="Container 6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:10:03.884013 containerd[1649]: time="2026-01-23T01:10:03.883969748Z" level=info msg="Container 4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:10:03.900176 containerd[1649]: time="2026-01-23T01:10:03.899910503Z" level=info msg="CreateContainer within sandbox \"1fba6b6905dba4645e88266fe70a4aa142cbe746f93e18bb5b879c0b531bf63b\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f\"" Jan 23 01:10:03.901257 containerd[1649]: time="2026-01-23T01:10:03.901234902Z" level=info msg="StartContainer for \"6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f\"" Jan 23 01:10:03.903414 containerd[1649]: time="2026-01-23T01:10:03.903369724Z" level=info msg="connecting to shim 6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f" address="unix:///run/containerd/s/e67d63aca0acd5730f46303c3a803929322950099b5ba83205749c67d3c853b3" protocol=ttrpc version=3 Jan 23 01:10:03.904748 containerd[1649]: time="2026-01-23T01:10:03.904646310Z" level=info msg="CreateContainer within sandbox \"c4ee34f4173baf8e1c162080964ab7436ae20b48c7779d95eacb7f98f8d25be1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982\"" Jan 23 01:10:03.905581 containerd[1649]: time="2026-01-23T01:10:03.905560481Z" level=info msg="StartContainer for \"4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982\"" Jan 23 01:10:03.908038 containerd[1649]: time="2026-01-23T01:10:03.907625651Z" level=info msg="connecting to shim 4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982" address="unix:///run/containerd/s/2484a299f42437d70d09f69af8414a9bdfc61bbe51ddde82ef6e3dbb06eaeb2b" protocol=ttrpc version=3 Jan 23 01:10:03.932752 
systemd[1]: Started cri-containerd-4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982.scope - libcontainer container 4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982. Jan 23 01:10:03.941025 systemd[1]: Started cri-containerd-6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f.scope - libcontainer container 6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f. Jan 23 01:10:03.999255 containerd[1649]: time="2026-01-23T01:10:03.999205432Z" level=info msg="StartContainer for \"6e3ac9cfa479bd3749ec3f24b72c5fedb001f0308deca5c08f9425a2833e9f8f\" returns successfully" Jan 23 01:10:04.003947 containerd[1649]: time="2026-01-23T01:10:04.003919263Z" level=info msg="StartContainer for \"4dea3c2dcea810611676f3fa74681fe60b85fc8ebdaacf2282823ca96bf0c982\" returns successfully" Jan 23 01:10:04.024093 kubelet[2882]: E0123 01:10:04.024023 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sx6dp" podUID="abca6cb1-d45d-4716-90fc-9fea5bf2bb4c" Jan 23 01:10:04.024562 kubelet[2882]: E0123 01:10:04.024535 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p7kwl" podUID="bfc1d39e-fed0-4ad0-8a64-aa0c649c314e" Jan 23 01:10:05.025339 kubelet[2882]: E0123 01:10:05.024809 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57669cbdbb-pnljq" podUID="38be9c86-8462-40da-b6c5-51dc537715d5" Jan 23 01:10:05.025339 kubelet[2882]: E0123 01:10:05.024981 2882 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-86c8df774c-zx58q" podUID="1e833cc8-d29c-40ae-955f-226c562444ef" Jan 23 01:10:05.096155 kubelet[2882]: E0123 01:10:05.093729 2882 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.5.114:48566->10.0.5.89:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-2-n-615049e46b.188d36e7a101e9a0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-2-n-615049e46b,UID:9308210fbd41571fa9ff2c37ae11c3e8,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-615049e46b,},FirstTimestamp:2026-01-23 01:09:57.09307536 +0000 UTC m=+238.165700948,LastTimestamp:2026-01-23 01:09:57.09307536 +0000 UTC m=+238.165700948,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-615049e46b,}" Jan 23 01:10:07.697768 systemd[1]: cri-containerd-7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c.scope: Deactivated successfully. Jan 23 01:10:07.699405 systemd[1]: cri-containerd-7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c.scope: Consumed 2.827s CPU time, 22.5M memory peak, 128K read from disk. Jan 23 01:10:07.705748 containerd[1649]: time="2026-01-23T01:10:07.705394372Z" level=info msg="received container exit event container_id:\"7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c\" id:\"7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c\" pid:2735 exit_status:1 exited_at:{seconds:1769130607 nanos:703993194}" Jan 23 01:10:07.762787 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c-rootfs.mount: Deactivated successfully. Jan 23 01:10:07.866403 kubelet[2882]: I0123 01:10:07.866079 2882 scope.go:117] "RemoveContainer" containerID="7b99a02993a681f3fcf5f58477f60b4baf347854d4fcbeaaf6f9c22777fb7f9c" Jan 23 01:10:07.870532 containerd[1649]: time="2026-01-23T01:10:07.870454548Z" level=info msg="CreateContainer within sandbox \"063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 23 01:10:07.888070 containerd[1649]: time="2026-01-23T01:10:07.887578332Z" level=info msg="Container e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e: CDI devices from CRI Config.CDIDevices: []" Jan 23 01:10:07.897269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2394619789.mount: Deactivated successfully. 
Jan 23 01:10:07.900985 containerd[1649]: time="2026-01-23T01:10:07.900864915Z" level=info msg="CreateContainer within sandbox \"063410f6a99220f369c7ed78845b82cd6d2494fe8cd58d48284ea0f85df9a1c3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e\"" Jan 23 01:10:07.901713 containerd[1649]: time="2026-01-23T01:10:07.901688395Z" level=info msg="StartContainer for \"e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e\"" Jan 23 01:10:07.903554 containerd[1649]: time="2026-01-23T01:10:07.903084566Z" level=info msg="connecting to shim e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e" address="unix:///run/containerd/s/1cff102426f9383e20572203ff4cdc8c69269f3edd5955a4c11bb9dbd611c080" protocol=ttrpc version=3 Jan 23 01:10:07.930691 systemd[1]: Started cri-containerd-e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e.scope - libcontainer container e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e. Jan 23 01:10:07.987982 containerd[1649]: time="2026-01-23T01:10:07.987398728Z" level=info msg="StartContainer for \"e743101e2994c2922870aadefc2fc3a3edc1caacbd5c65824dc3ecc2139d9a9e\" returns successfully" Jan 23 01:10:11.612271 update_engine[1630]: I20260123 01:10:11.612193 1630 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 23 01:10:11.612271 update_engine[1630]: I20260123 01:10:11.612282 1630 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 23 01:10:11.612735 update_engine[1630]: I20260123 01:10:11.612632 1630 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 23 01:10:11.618387 update_engine[1630]: E20260123 01:10:11.618320 1630 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 23 01:10:11.618579 update_engine[1630]: I20260123 01:10:11.618500 1630 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 23 01:10:12.991406 kubelet[2882]: E0123 01:10:12.990846 2882 controller.go:195] "Failed to update lease" err="Put \"https://10.0.5.114:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-615049e46b?timeout=10s\": context deadline exceeded"