Oct 28 00:28:11.702128 kernel: Linux version 6.12.54-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Oct 27 22:07:04 -00 2025
Oct 28 00:28:11.702143 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=aad2c79de86859b0e91c2fe8391f4e8b4c39a83af1e3d16d6c1a718ff51907cd
Oct 28 00:28:11.702150 kernel: Disabled fast string operations
Oct 28 00:28:11.702154 kernel: BIOS-provided physical RAM map:
Oct 28 00:28:11.702158 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Oct 28 00:28:11.702162 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Oct 28 00:28:11.702167 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Oct 28 00:28:11.702172 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Oct 28 00:28:11.702176 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Oct 28 00:28:11.702180 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Oct 28 00:28:11.702184 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Oct 28 00:28:11.702189 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Oct 28 00:28:11.702193 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Oct 28 00:28:11.702197 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Oct 28 00:28:11.702203 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Oct 28 00:28:11.702208 kernel: NX (Execute Disable) protection: active
Oct 28 00:28:11.702212 kernel: APIC: Static calls initialized
Oct 28 00:28:11.702217 kernel: SMBIOS 2.7 present.
Oct 28 00:28:11.702222 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Oct 28 00:28:11.702227 kernel: DMI: Memory slots populated: 1/128
Oct 28 00:28:11.702232 kernel: vmware: hypercall mode: 0x00
Oct 28 00:28:11.702236 kernel: Hypervisor detected: VMware
Oct 28 00:28:11.702241 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Oct 28 00:28:11.702246 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Oct 28 00:28:11.702269 kernel: vmware: using clock offset of 4402334000 ns
Oct 28 00:28:11.702274 kernel: tsc: Detected 3408.000 MHz processor
Oct 28 00:28:11.702279 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 28 00:28:11.702285 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 28 00:28:11.702289 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Oct 28 00:28:11.702294 kernel: total RAM covered: 3072M
Oct 28 00:28:11.702299 kernel: Found optimal setting for mtrr clean up
Oct 28 00:28:11.702306 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Oct 28 00:28:11.702311 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Oct 28 00:28:11.702317 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Oct 28 00:28:11.702322 kernel: Using GB pages for direct mapping
Oct 28 00:28:11.702327 kernel: ACPI: Early table checksum verification disabled
Oct 28 00:28:11.702332 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Oct 28 00:28:11.702337 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Oct 28 00:28:11.702342 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Oct 28 00:28:11.702347 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Oct 28 00:28:11.702354 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Oct 28 00:28:11.702616 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Oct 28 00:28:11.702623 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Oct 28 00:28:11.702629 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Oct 28 00:28:11.702634 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Oct 28 00:28:11.702639 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Oct 28 00:28:11.702645 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Oct 28 00:28:11.702651 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Oct 28 00:28:11.702656 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Oct 28 00:28:11.702661 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Oct 28 00:28:11.702666 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Oct 28 00:28:11.702672 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Oct 28 00:28:11.702676 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Oct 28 00:28:11.702682 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Oct 28 00:28:11.702687 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Oct 28 00:28:11.702692 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Oct 28 00:28:11.702698 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Oct 28 00:28:11.702703 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Oct 28 00:28:11.702708 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Oct 28 00:28:11.702713 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Oct 28 00:28:11.702718 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Oct 28 00:28:11.702724 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Oct 28 00:28:11.702729 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Oct 28 00:28:11.702734 kernel: Zone ranges:
Oct 28 00:28:11.702739 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Oct 28 00:28:11.702745 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Oct 28 00:28:11.702750 kernel: Normal empty
Oct 28 00:28:11.702755 kernel: Device empty
Oct 28 00:28:11.702760 kernel: Movable zone start for each node
Oct 28 00:28:11.702765 kernel: Early memory node ranges
Oct 28 00:28:11.702770 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Oct 28 00:28:11.702775 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Oct 28 00:28:11.702780 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Oct 28 00:28:11.702785 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Oct 28 00:28:11.702790 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 28 00:28:11.702796 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Oct 28 00:28:11.702801 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Oct 28 00:28:11.702806 kernel: ACPI: PM-Timer IO Port: 0x1008
Oct 28 00:28:11.702811 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Oct 28 00:28:11.702816 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Oct 28 00:28:11.702821 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Oct 28 00:28:11.702826 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Oct 28 00:28:11.702831 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Oct 28 00:28:11.702836 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Oct 28 00:28:11.702841 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Oct 28 00:28:11.702847 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Oct 28 00:28:11.702852 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Oct 28 00:28:11.702856 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Oct 28 00:28:11.702861 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Oct 28 00:28:11.702866 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Oct 28 00:28:11.702871 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Oct 28 00:28:11.702876 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Oct 28 00:28:11.702881 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Oct 28 00:28:11.702886 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Oct 28 00:28:11.702892 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Oct 28 00:28:11.702897 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Oct 28 00:28:11.702902 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Oct 28 00:28:11.702907 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Oct 28 00:28:11.702912 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Oct 28 00:28:11.702917 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Oct 28 00:28:11.702922 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Oct 28 00:28:11.702927 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Oct 28 00:28:11.702932 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Oct 28 00:28:11.702937 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Oct 28 00:28:11.702943 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Oct 28 00:28:11.702947 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Oct 28 00:28:11.702953 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Oct 28 00:28:11.702957 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Oct 28 00:28:11.702963 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Oct 28 00:28:11.702968 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Oct 28 00:28:11.702973 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Oct 28 00:28:11.702977 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Oct 28 00:28:11.702982 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Oct 28 00:28:11.702987 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Oct 28 00:28:11.702994 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Oct 28 00:28:11.702999 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Oct 28 00:28:11.703003 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Oct 28 00:28:11.703009 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Oct 28 00:28:11.703018 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Oct 28 00:28:11.703023 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Oct 28 00:28:11.703029 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Oct 28 00:28:11.703034 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Oct 28 00:28:11.703040 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Oct 28 00:28:11.703046 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Oct 28 00:28:11.703051 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Oct 28 00:28:11.703056 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Oct 28 00:28:11.703061 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Oct 28 00:28:11.703067 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Oct 28 00:28:11.703072 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Oct 28 00:28:11.703077 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Oct 28 00:28:11.703082 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Oct 28 00:28:11.703088 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Oct 28 00:28:11.703094 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Oct 28 00:28:11.703099 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Oct 28 00:28:11.703104 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Oct 28 00:28:11.703109 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Oct 28 00:28:11.703115 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Oct 28 00:28:11.703120 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Oct 28 00:28:11.703125 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Oct 28 00:28:11.703130 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Oct 28 00:28:11.703136 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Oct 28 00:28:11.703141 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Oct 28 00:28:11.703147 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Oct 28 00:28:11.703153 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Oct 28 00:28:11.703158 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Oct 28 00:28:11.703163 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Oct 28 00:28:11.703168 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Oct 28 00:28:11.703174 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Oct 28 00:28:11.703179 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Oct 28 00:28:11.703184 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Oct 28 00:28:11.703189 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Oct 28 00:28:11.703195 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Oct 28 00:28:11.703201 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Oct 28 00:28:11.703206 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Oct 28 00:28:11.703212 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Oct 28 00:28:11.703217 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Oct 28 00:28:11.703222 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Oct 28 00:28:11.703227 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Oct 28 00:28:11.703233 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Oct 28 00:28:11.703238 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Oct 28 00:28:11.703247 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Oct 28 00:28:11.703254 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Oct 28 00:28:11.703259 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Oct 28 00:28:11.703264 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Oct 28 00:28:11.703270 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Oct 28 00:28:11.703275 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Oct 28 00:28:11.703280 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Oct 28 00:28:11.703285 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Oct 28 00:28:11.703291 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Oct 28 00:28:11.703296 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Oct 28 00:28:11.703301 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Oct 28 00:28:11.703307 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Oct 28 00:28:11.703313 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Oct 28 00:28:11.703318 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Oct 28 00:28:11.703323 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Oct 28 00:28:11.703328 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Oct 28 00:28:11.703333 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Oct 28 00:28:11.703339 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Oct 28 00:28:11.703344 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Oct 28 00:28:11.703349 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Oct 28 00:28:11.703355 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Oct 28 00:28:11.703369 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Oct 28 00:28:11.703375 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Oct 28 00:28:11.703381 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Oct 28 00:28:11.703386 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Oct 28 00:28:11.703391 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Oct 28 00:28:11.703397 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Oct 28 00:28:11.703402 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Oct 28 00:28:11.703407 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Oct 28 00:28:11.703431 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Oct 28 00:28:11.703436 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Oct 28 00:28:11.703443 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Oct 28 00:28:11.703449 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Oct 28 00:28:11.703454 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Oct 28 00:28:11.703459 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Oct 28 00:28:11.703465 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Oct 28 00:28:11.703811 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Oct 28 00:28:11.703817 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Oct 28 00:28:11.703823 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Oct 28 00:28:11.703828 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Oct 28 00:28:11.703834 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Oct 28 00:28:11.703841 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Oct 28 00:28:11.703847 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Oct 28 00:28:11.703852 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Oct 28 00:28:11.703857 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Oct 28 00:28:11.703863 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Oct 28 00:28:11.703868 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Oct 28 00:28:11.703874 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Oct 28 00:28:11.703879 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 28 00:28:11.703885 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Oct 28 00:28:11.703891 kernel: TSC deadline timer available
Oct 28 00:28:11.703897 kernel: CPU topo: Max. logical packages: 128
Oct 28 00:28:11.703903 kernel: CPU topo: Max. logical dies: 128
Oct 28 00:28:11.703908 kernel: CPU topo: Max. dies per package: 1
Oct 28 00:28:11.703913 kernel: CPU topo: Max. threads per core: 1
Oct 28 00:28:11.703919 kernel: CPU topo: Num. cores per package: 1
Oct 28 00:28:11.703924 kernel: CPU topo: Num. threads per package: 1
Oct 28 00:28:11.703930 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Oct 28 00:28:11.703935 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Oct 28 00:28:11.703941 kernel: Booting paravirtualized kernel on VMware hypervisor
Oct 28 00:28:11.703948 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 28 00:28:11.703953 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Oct 28 00:28:11.703959 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Oct 28 00:28:11.703965 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Oct 28 00:28:11.703970 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Oct 28 00:28:11.703976 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Oct 28 00:28:11.703981 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Oct 28 00:28:11.703987 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Oct 28 00:28:11.703992 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Oct 28 00:28:11.703999 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Oct 28 00:28:11.704004 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Oct 28 00:28:11.704009 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Oct 28 00:28:11.704015 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Oct 28 00:28:11.704020 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Oct 28 00:28:11.704025 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Oct 28 00:28:11.704031 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Oct 28 00:28:11.704036 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Oct 28 00:28:11.704043 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Oct 28 00:28:11.704049 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Oct 28 00:28:11.704054 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Oct 28 00:28:11.704060 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=aad2c79de86859b0e91c2fe8391f4e8b4c39a83af1e3d16d6c1a718ff51907cd
Oct 28 00:28:11.704066 kernel: random: crng init done
Oct 28 00:28:11.704072 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Oct 28 00:28:11.704077 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Oct 28 00:28:11.704083 kernel: printk: log_buf_len min size: 262144 bytes
Oct 28 00:28:11.704088 kernel: printk: log_buf_len: 1048576 bytes
Oct 28 00:28:11.704095 kernel: printk: early log buf free: 245704(93%)
Oct 28 00:28:11.704100 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Oct 28 00:28:11.704106 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 28 00:28:11.704112 kernel: Fallback order for Node 0: 0
Oct 28 00:28:11.704117 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Oct 28 00:28:11.704123 kernel: Policy zone: DMA32
Oct 28 00:28:11.704129 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 28 00:28:11.704134 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Oct 28 00:28:11.704140 kernel: ftrace: allocating 40021 entries in 157 pages
Oct 28 00:28:11.704147 kernel: ftrace: allocated 157 pages with 5 groups
Oct 28 00:28:11.704152 kernel: Dynamic Preempt: voluntary
Oct 28 00:28:11.704158 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 28 00:28:11.704164 kernel: rcu: RCU event tracing is enabled.
Oct 28 00:28:11.704169 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Oct 28 00:28:11.704175 kernel: Trampoline variant of Tasks RCU enabled.
Oct 28 00:28:11.704181 kernel: Rude variant of Tasks RCU enabled.
Oct 28 00:28:11.704186 kernel: Tracing variant of Tasks RCU enabled.
Oct 28 00:28:11.704192 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 28 00:28:11.704198 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Oct 28 00:28:11.704204 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Oct 28 00:28:11.704210 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Oct 28 00:28:11.704216 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Oct 28 00:28:11.704221 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Oct 28 00:28:11.704226 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Oct 28 00:28:11.704232 kernel: Console: colour VGA+ 80x25
Oct 28 00:28:11.704238 kernel: printk: legacy console [tty0] enabled
Oct 28 00:28:11.704243 kernel: printk: legacy console [ttyS0] enabled
Oct 28 00:28:11.704250 kernel: ACPI: Core revision 20240827
Oct 28 00:28:11.704255 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Oct 28 00:28:11.704261 kernel: APIC: Switch to symmetric I/O mode setup
Oct 28 00:28:11.704266 kernel: x2apic enabled
Oct 28 00:28:11.704272 kernel: APIC: Switched APIC routing to: physical x2apic
Oct 28 00:28:11.704278 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Oct 28 00:28:11.704289 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Oct 28 00:28:11.704298 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Oct 28 00:28:11.704304 kernel: Disabled fast string operations
Oct 28 00:28:11.704311 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Oct 28 00:28:11.704316 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Oct 28 00:28:11.704322 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 28 00:28:11.704328 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Oct 28 00:28:11.704333 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Oct 28 00:28:11.704345 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Oct 28 00:28:11.704351 kernel: RETBleed: Mitigation: Enhanced IBRS
Oct 28 00:28:11.704356 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 28 00:28:11.704443 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 28 00:28:11.704452 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Oct 28 00:28:11.704457 kernel: SRBDS: Unknown: Dependent on hypervisor status
Oct 28 00:28:11.704463 kernel: GDS: Unknown: Dependent on hypervisor status
Oct 28 00:28:11.704468 kernel: active return thunk: its_return_thunk
Oct 28 00:28:11.704474 kernel: ITS: Mitigation: Aligned branch/return thunks
Oct 28 00:28:11.704480 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 28 00:28:11.704485 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 28 00:28:11.704491 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 28 00:28:11.704496 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Oct 28 00:28:11.704503 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 28 00:28:11.704508 kernel: Freeing SMP alternatives memory: 32K
Oct 28 00:28:11.704514 kernel: pid_max: default: 131072 minimum: 1024
Oct 28 00:28:11.704519 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Oct 28 00:28:11.704525 kernel: landlock: Up and running.
Oct 28 00:28:11.704530 kernel: SELinux: Initializing.
Oct 28 00:28:11.704536 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Oct 28 00:28:11.704542 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Oct 28 00:28:11.704563 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Oct 28 00:28:11.704569 kernel: Performance Events: Skylake events, core PMU driver.
Oct 28 00:28:11.704574 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Oct 28 00:28:11.704580 kernel: core: CPUID marked event: 'instructions' unavailable
Oct 28 00:28:11.704585 kernel: core: CPUID marked event: 'bus cycles' unavailable
Oct 28 00:28:11.704591 kernel: core: CPUID marked event: 'cache references' unavailable
Oct 28 00:28:11.704596 kernel: core: CPUID marked event: 'cache misses' unavailable
Oct 28 00:28:11.704601 kernel: core: CPUID marked event: 'branch instructions' unavailable
Oct 28 00:28:11.704606 kernel: core: CPUID marked event: 'branch misses' unavailable
Oct 28 00:28:11.704613 kernel: ... version: 1
Oct 28 00:28:11.704618 kernel: ... bit width: 48
Oct 28 00:28:11.704624 kernel: ... generic registers: 4
Oct 28 00:28:11.704629 kernel: ... value mask: 0000ffffffffffff
Oct 28 00:28:11.704634 kernel: ... max period: 000000007fffffff
Oct 28 00:28:11.704640 kernel: ... fixed-purpose events: 0
Oct 28 00:28:11.704645 kernel: ... event mask: 000000000000000f
Oct 28 00:28:11.704650 kernel: signal: max sigframe size: 1776
Oct 28 00:28:11.704656 kernel: rcu: Hierarchical SRCU implementation.
Oct 28 00:28:11.704662 kernel: rcu: Max phase no-delay instances is 400.
Oct 28 00:28:11.704668 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Oct 28 00:28:11.704673 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Oct 28 00:28:11.704679 kernel: smp: Bringing up secondary CPUs ...
Oct 28 00:28:11.704684 kernel: smpboot: x86: Booting SMP configuration:
Oct 28 00:28:11.704689 kernel: .... node #0, CPUs: #1
Oct 28 00:28:11.704695 kernel: Disabled fast string operations
Oct 28 00:28:11.704700 kernel: smp: Brought up 1 node, 2 CPUs
Oct 28 00:28:11.704706 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Oct 28 00:28:11.704711 kernel: Memory: 1918044K/2096628K available (14336K kernel code, 2436K rwdata, 26048K rodata, 45532K init, 1196K bss, 167200K reserved, 0K cma-reserved)
Oct 28 00:28:11.704718 kernel: devtmpfs: initialized
Oct 28 00:28:11.704723 kernel: x86/mm: Memory block size: 128MB
Oct 28 00:28:11.704729 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Oct 28 00:28:11.704734 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 28 00:28:11.704739 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Oct 28 00:28:11.704745 kernel: pinctrl core: initialized pinctrl subsystem
Oct 28 00:28:11.704750 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 28 00:28:11.704756 kernel: audit: initializing netlink subsys (disabled)
Oct 28 00:28:11.704761 kernel: audit: type=2000 audit(1761611288.282:1): state=initialized audit_enabled=0 res=1
Oct 28 00:28:11.704767 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 28 00:28:11.704773 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 28 00:28:11.704778 kernel: cpuidle: using governor menu
Oct 28 00:28:11.704783 kernel: Simple Boot Flag at 0x36 set to 0x80
Oct 28 00:28:11.704789 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 28 00:28:11.704795 kernel: dca service started, version 1.12.1
Oct 28 00:28:11.704800 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Oct 28 00:28:11.704812 kernel: PCI: Using configuration type 1 for base access
Oct 28 00:28:11.704819 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 28 00:28:11.704826 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 28 00:28:11.704831 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 28 00:28:11.704837 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 28 00:28:11.704842 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 28 00:28:11.704848 kernel: ACPI: Added _OSI(Module Device)
Oct 28 00:28:11.704854 kernel: ACPI: Added _OSI(Processor Device)
Oct 28 00:28:11.704859 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 28 00:28:11.704865 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 28 00:28:11.704871 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Oct 28 00:28:11.704877 kernel: ACPI: Interpreter enabled
Oct 28 00:28:11.704883 kernel: ACPI: PM: (supports S0 S1 S5)
Oct 28 00:28:11.704888 kernel: ACPI: Using IOAPIC for interrupt routing
Oct 28 00:28:11.704895 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 28 00:28:11.704901 kernel: PCI: Using E820 reservations for host bridge windows
Oct 28 00:28:11.704907 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Oct 28 00:28:11.704913 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Oct 28 00:28:11.704996 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Oct 28 00:28:11.705050 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Oct 28 00:28:11.705098 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Oct 28 00:28:11.705106 kernel: PCI host bridge to bus 0000:00
Oct 28 00:28:11.705156 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 28 00:28:11.705200 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Oct 28 00:28:11.705242 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 28 00:28:11.705284 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Oct 28 00:28:11.705328 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Oct 28 00:28:11.705382 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Oct 28 00:28:11.705446 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Oct 28 00:28:11.705528 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Oct 28 00:28:11.705587 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Oct 28 00:28:11.705644 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Oct 28 00:28:11.705699 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Oct 28 00:28:11.705748 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Oct 28 00:28:11.705796 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Oct 28 00:28:11.705844 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Oct 28 00:28:11.705894 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Oct 28 00:28:11.705941 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Oct 28 00:28:11.706005 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 28 00:28:11.706056 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Oct 28 00:28:11.706104 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Oct 28 00:28:11.706172 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Oct 28 00:28:11.706224 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Oct 28 00:28:11.706277 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Oct 28 00:28:11.706329 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Oct 28 00:28:11.706394 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Oct 28 00:28:11.706444 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Oct 28 00:28:11.706492 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Oct 28 00:28:11.706539 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Oct 28 00:28:11.706589 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 28 00:28:11.706643 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Oct 28 00:28:11.706691 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Oct 28 00:28:11.706740 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Oct 28 00:28:11.706787 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Oct 28 00:28:11.706835 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Oct 28 00:28:11.706887 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Oct 28 00:28:11.706940 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Oct 28 00:28:11.706988 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Oct 28 00:28:11.707036 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Oct 28 00:28:11.707084 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Oct 28 00:28:11.707138 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Oct 28 00:28:11.707187 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Oct 28 00:28:11.707235 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Oct 28 00:28:11.707286 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Oct 28 00:28:11.707338 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Oct 28 00:28:11.707402 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Oct 28 00:28:11.707474 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Oct 28 00:28:11.707524 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Oct 28 00:28:11.707599 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Oct 28 00:28:11.707707 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Oct 28 00:28:11.707766 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:28:11.707825 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.707885 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.707977 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:28:11.708025 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 28 00:28:11.708073 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:28:11.708124 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.708176 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.708225 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:28:11.708273 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 28 00:28:11.708322 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:28:11.708383 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.708439 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.708490 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:28:11.708540 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 28 00:28:11.708608 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:28:11.708679 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.708741 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.708802 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:28:11.708852 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 28 00:28:11.708907 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 
00:28:11.708956 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.709666 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.709736 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:28:11.709793 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 28 00:28:11.709847 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:28:11.709898 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.709959 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.710017 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:28:11.710068 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 28 00:28:11.710117 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 28 00:28:11.710166 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.710221 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.710284 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:28:11.710339 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 28 00:28:11.710405 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 28 00:28:11.710455 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 28 00:28:11.710504 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.710558 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.710608 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:28:11.710658 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 28 00:28:11.710706 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 28 00:28:11.710758 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:28:11.710807 kernel: pci 0000:00:16.2: PME# supported 
from D0 D3hot D3cold Oct 28 00:28:11.710861 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.710911 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:28:11.710965 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 28 00:28:11.711014 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:28:11.711064 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.711127 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.711179 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:28:11.711229 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 28 00:28:11.712438 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:28:11.712711 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.712776 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.712829 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:28:11.712883 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 28 00:28:11.712934 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:28:11.712984 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.713037 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.713088 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:28:11.713136 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 28 00:28:11.713186 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:28:11.713238 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.713302 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.713369 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 
00:28:11.713423 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 28 00:28:11.713473 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:28:11.713523 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.713579 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.713633 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 28 00:28:11.713685 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 28 00:28:11.713735 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 28 00:28:11.713784 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:28:11.713833 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.713886 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.713937 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:28:11.713988 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 28 00:28:11.714038 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 28 00:28:11.714087 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:28:11.714136 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.714192 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.714242 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:28:11.714291 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 28 00:28:11.714340 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 28 00:28:11.714403 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 00:28:11.714453 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.714506 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.714559 kernel: pci 
0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:28:11.714607 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 28 00:28:11.714656 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 00:28:11.714705 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.714758 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.714808 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:28:11.714857 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 28 00:28:11.714908 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:28:11.714957 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.715013 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.715063 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:28:11.715111 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 28 00:28:11.715160 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:28:11.715209 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.715262 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.715315 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:28:11.715378 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 28 00:28:11.715432 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Oct 28 00:28:11.715481 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.715536 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.715586 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:28:11.715635 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 28 00:28:11.715687 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:28:11.715736 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.715789 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.715838 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:28:11.715887 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 28 00:28:11.715936 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 28 00:28:11.715985 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:28:11.716036 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.716092 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.716142 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:28:11.716191 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 28 00:28:11.716239 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Oct 28 00:28:11.716292 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:28:11.716345 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.716414 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.716472 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:28:11.716529 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 28 00:28:11.716577 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:28:11.716626 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.716679 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.716728 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 28 00:28:11.716780 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 28 00:28:11.716830 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 
28 00:28:11.716880 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.716933 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.716983 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:28:11.717031 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 28 00:28:11.717080 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 28 00:28:11.717131 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.717187 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.717238 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:28:11.717287 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 28 00:28:11.717336 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:28:11.717404 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.717459 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.717513 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:28:11.717563 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 28 00:28:11.717612 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:28:11.717661 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.717714 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:28:11.717764 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:28:11.717813 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 28 00:28:11.717865 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:28:11.717913 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.717970 kernel: pci_bus 0000:01: extended config space not accessible Oct 28 00:28:11.718022 kernel: pci 
0000:00:01.0: PCI bridge to [bus 01] Oct 28 00:28:11.718073 kernel: pci_bus 0000:02: extended config space not accessible Oct 28 00:28:11.718083 kernel: acpiphp: Slot [32] registered Oct 28 00:28:11.718089 kernel: acpiphp: Slot [33] registered Oct 28 00:28:11.718095 kernel: acpiphp: Slot [34] registered Oct 28 00:28:11.718103 kernel: acpiphp: Slot [35] registered Oct 28 00:28:11.718108 kernel: acpiphp: Slot [36] registered Oct 28 00:28:11.718115 kernel: acpiphp: Slot [37] registered Oct 28 00:28:11.718121 kernel: acpiphp: Slot [38] registered Oct 28 00:28:11.718126 kernel: acpiphp: Slot [39] registered Oct 28 00:28:11.718132 kernel: acpiphp: Slot [40] registered Oct 28 00:28:11.718138 kernel: acpiphp: Slot [41] registered Oct 28 00:28:11.718144 kernel: acpiphp: Slot [42] registered Oct 28 00:28:11.718150 kernel: acpiphp: Slot [43] registered Oct 28 00:28:11.718156 kernel: acpiphp: Slot [44] registered Oct 28 00:28:11.718163 kernel: acpiphp: Slot [45] registered Oct 28 00:28:11.718169 kernel: acpiphp: Slot [46] registered Oct 28 00:28:11.718174 kernel: acpiphp: Slot [47] registered Oct 28 00:28:11.718180 kernel: acpiphp: Slot [48] registered Oct 28 00:28:11.718186 kernel: acpiphp: Slot [49] registered Oct 28 00:28:11.718192 kernel: acpiphp: Slot [50] registered Oct 28 00:28:11.718198 kernel: acpiphp: Slot [51] registered Oct 28 00:28:11.718204 kernel: acpiphp: Slot [52] registered Oct 28 00:28:11.718209 kernel: acpiphp: Slot [53] registered Oct 28 00:28:11.718217 kernel: acpiphp: Slot [54] registered Oct 28 00:28:11.718223 kernel: acpiphp: Slot [55] registered Oct 28 00:28:11.718228 kernel: acpiphp: Slot [56] registered Oct 28 00:28:11.718234 kernel: acpiphp: Slot [57] registered Oct 28 00:28:11.718240 kernel: acpiphp: Slot [58] registered Oct 28 00:28:11.718245 kernel: acpiphp: Slot [59] registered Oct 28 00:28:11.718251 kernel: acpiphp: Slot [60] registered Oct 28 00:28:11.718257 kernel: acpiphp: Slot [61] registered Oct 28 00:28:11.718263 kernel: acpiphp: Slot 
[62] registered Oct 28 00:28:11.718269 kernel: acpiphp: Slot [63] registered Oct 28 00:28:11.718322 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Oct 28 00:28:11.718386 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Oct 28 00:28:11.718435 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Oct 28 00:28:11.718485 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Oct 28 00:28:11.718543 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Oct 28 00:28:11.718595 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Oct 28 00:28:11.718654 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Oct 28 00:28:11.718708 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Oct 28 00:28:11.718760 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Oct 28 00:28:11.718810 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 28 00:28:11.718860 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Oct 28 00:28:11.718910 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Oct 28 00:28:11.718961 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 28 00:28:11.719012 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 28 00:28:11.719067 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 28 00:28:11.719122 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:28:11.719188 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:28:11.719240 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:28:11.719297 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:28:11.719348 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:28:11.719421 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Oct 28 00:28:11.719477 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Oct 28 00:28:11.719527 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Oct 28 00:28:11.719578 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Oct 28 00:28:11.719629 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Oct 28 00:28:11.719680 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 28 00:28:11.719731 kernel: pci 0000:0b:00.0: supports D1 D2 Oct 28 00:28:11.719781 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 28 00:28:11.719831 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Oct 28 00:28:11.719885 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:28:11.719937 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:28:11.719989 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:28:11.720041 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:28:11.720092 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:28:11.720142 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:28:11.720193 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:28:11.720254 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 00:28:11.720311 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 28 00:28:11.720371 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:28:11.720423 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:28:11.720484 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:28:11.720536 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:28:11.720591 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:28:11.720641 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:28:11.720700 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:28:11.720750 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:28:11.720801 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:28:11.720862 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:28:11.722439 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 28 00:28:11.722498 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:28:11.722551 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:28:11.722606 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:28:11.722657 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:28:11.722667 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Oct 28 00:28:11.722673 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Oct 28 00:28:11.722680 kernel: ACPI: PCI: Interrupt link LNKB 
disabled Oct 28 00:28:11.722686 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Oct 28 00:28:11.722692 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Oct 28 00:28:11.722698 kernel: iommu: Default domain type: Translated Oct 28 00:28:11.722706 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Oct 28 00:28:11.722712 kernel: PCI: Using ACPI for IRQ routing Oct 28 00:28:11.722718 kernel: PCI: pci_cache_line_size set to 64 bytes Oct 28 00:28:11.722724 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Oct 28 00:28:11.722730 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Oct 28 00:28:11.722780 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Oct 28 00:28:11.722829 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Oct 28 00:28:11.722879 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Oct 28 00:28:11.722888 kernel: vgaarb: loaded Oct 28 00:28:11.722896 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Oct 28 00:28:11.722902 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Oct 28 00:28:11.722908 kernel: clocksource: Switched to clocksource tsc-early Oct 28 00:28:11.722914 kernel: VFS: Disk quotas dquot_6.6.0 Oct 28 00:28:11.722920 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Oct 28 00:28:11.722926 kernel: pnp: PnP ACPI init Oct 28 00:28:11.722980 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Oct 28 00:28:11.723026 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Oct 28 00:28:11.723075 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Oct 28 00:28:11.723126 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Oct 28 00:28:11.723174 kernel: pnp 00:06: [dma 2] Oct 28 00:28:11.723223 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Oct 28 00:28:11.723283 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Oct 28 
00:28:11.723330 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Oct 28 00:28:11.723340 kernel: pnp: PnP ACPI: found 8 devices Oct 28 00:28:11.723347 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Oct 28 00:28:11.723353 kernel: NET: Registered PF_INET protocol family Oct 28 00:28:11.723399 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Oct 28 00:28:11.723406 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Oct 28 00:28:11.723412 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Oct 28 00:28:11.723417 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Oct 28 00:28:11.723423 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 28 00:28:11.723429 kernel: TCP: Hash tables configured (established 16384 bind 16384) Oct 28 00:28:11.723437 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 28 00:28:11.723443 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 28 00:28:11.723449 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Oct 28 00:28:11.723455 kernel: NET: Registered PF_XDP protocol family Oct 28 00:28:11.723509 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Oct 28 00:28:11.723561 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Oct 28 00:28:11.723613 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Oct 28 00:28:11.723665 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Oct 28 00:28:11.723716 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Oct 28 00:28:11.723769 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Oct 28 00:28:11.723820 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Oct 28 00:28:11.723871 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Oct 28 00:28:11.723921 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Oct 28 00:28:11.723971 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Oct 28 00:28:11.724021 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Oct 28 00:28:11.724071 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Oct 28 00:28:11.724125 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Oct 28 00:28:11.724175 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Oct 28 00:28:11.724225 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Oct 28 00:28:11.724275 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Oct 28 00:28:11.724326 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Oct 28 00:28:11.725586 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Oct 28 00:28:11.725649 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Oct 28 00:28:11.725705 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Oct 28 00:28:11.725761 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Oct 28 00:28:11.725812 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Oct 28 00:28:11.725862 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Oct 28 00:28:11.725912 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Oct 28 00:28:11.725962 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Oct 28 00:28:11.726013 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726063 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726113 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726165 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726215 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726272 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726324 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726405 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726457 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726507 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726557 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726610 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726659 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726709 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726760 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726809 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726859 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.726911 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.726975 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
Oct 28 00:28:11.727033 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727097 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727151 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727209 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727264 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727323 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727396 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727456 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727506 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727557 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727606 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727661 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727713 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727763 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727813 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727865 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.727914 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.727964 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.728019 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.728070 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Oct 28 00:28:11.728120 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.728169 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.728219 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.728270 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.728319 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.730391 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.730467 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.730535 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.730599 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.730664 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.730729 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.730792 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.730855 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.730915 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.730980 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731046 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731112 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731169 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731219 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731269 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731318 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731383 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731437 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731487 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731536 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731587 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731636 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731686 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731735 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731784 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731832 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731884 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.731932 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.731982 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732045 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732099 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732148 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732219 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732269 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732320 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732385 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732439 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732488 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732537 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:28:11.732586 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:28:11.732637 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 28 00:28:11.732688 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Oct 28 00:28:11.732740 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Oct 28 00:28:11.732788 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Oct 28 00:28:11.732837 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 28 00:28:11.732891 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Oct 28 00:28:11.732943 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 28 00:28:11.732992 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Oct 28 00:28:11.733040 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Oct 28 00:28:11.733089 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Oct 28 00:28:11.733141 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 28 00:28:11.733190 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Oct 28 00:28:11.733239 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Oct 28 00:28:11.733292 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Oct 28 00:28:11.733346 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 28 00:28:11.733408 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Oct 28 00:28:11.733459 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Oct 28 00:28:11.733507 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:28:11.733556 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:28:11.733604 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 28 00:28:11.733657 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:28:11.733706 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:28:11.733756 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 28 00:28:11.733805 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:28:11.733853 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:28:11.733902 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 28 00:28:11.733950 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:28:11.734002 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:28:11.734050 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 28 00:28:11.734099 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 00:28:11.734148 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:28:11.734197 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 28 00:28:11.734245 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:28:11.734298 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Oct 28 00:28:11.734351 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:28:11.734780 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 28 00:28:11.734834 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 28 00:28:11.734885 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Oct 28 00:28:11.734936 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:28:11.734986 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 28 00:28:11.735036 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 28 00:28:11.735084 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 28 00:28:11.735135 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:28:11.735187 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 28 00:28:11.735236 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 28 00:28:11.735285 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:28:11.735335 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:28:11.735398 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 28 00:28:11.735450 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:28:11.735499 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:28:11.735547 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 28 00:28:11.735596 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:28:11.735648 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:28:11.735697 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 28 00:28:11.735745 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:28:11.735795 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:28:11.735843 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 28 00:28:11.735891 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:28:11.735983 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 00:28:11.736231 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 28 00:28:11.736323 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:28:11.736474 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 
28 00:28:11.736527 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 28 00:28:11.736576 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 28 00:28:11.736626 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:28:11.736677 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:28:11.736725 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 28 00:28:11.736777 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 28 00:28:11.736826 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:28:11.736876 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:28:11.736924 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 28 00:28:11.736973 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 28 00:28:11.737021 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 00:28:11.737070 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:28:11.737119 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 28 00:28:11.737168 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 00:28:11.737221 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:28:11.737269 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 28 00:28:11.737318 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:28:11.737379 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:28:11.737431 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 28 00:28:11.737480 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:28:11.737530 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:28:11.737582 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 28 00:28:11.737631 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Oct 28 00:28:11.737680 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:28:11.737728 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 28 00:28:11.737776 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:28:11.737826 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:28:11.737875 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 28 00:28:11.737923 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 28 00:28:11.737975 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:28:11.738059 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:28:11.738109 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 28 00:28:11.738158 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Oct 28 00:28:11.738206 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:28:11.738255 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:28:11.738304 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 28 00:28:11.738352 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:28:11.738423 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 28 00:28:11.738488 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 28 00:28:11.738537 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 28 00:28:11.738588 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:28:11.738637 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 28 00:28:11.738686 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 28 00:28:11.738735 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:28:11.738783 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 28 00:28:11.738832 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:28:11.738885 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:28:11.738934 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 28 00:28:11.738982 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:28:11.739032 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:28:11.739081 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 28 00:28:11.739130 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:28:11.739182 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Oct 28 00:28:11.739226 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 28 00:28:11.739273 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 28 00:28:11.739321 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Oct 28 00:28:11.739375 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Oct 28 00:28:11.739424 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Oct 28 00:28:11.739469 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Oct 28 00:28:11.739516 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 28 00:28:11.739561 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Oct 28 00:28:11.739614 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 28 00:28:11.739676 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 28 00:28:11.739736 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Oct 28 00:28:11.739794 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Oct 28 00:28:11.739854 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Oct 28 00:28:11.739909 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Oct 28 00:28:11.739960 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Oct 28 00:28:11.740015 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Oct 28 00:28:11.740062 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Oct 28 00:28:11.740107 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Oct 28 00:28:11.740164 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Oct 28 00:28:11.740256 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Oct 28 00:28:11.740306 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:28:11.740367 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Oct 28 00:28:11.740415 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:28:11.740467 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Oct 28 00:28:11.740512 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:28:11.740561 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Oct 28 00:28:11.740606 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:28:11.740658 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Oct 28 00:28:11.740704 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 00:28:11.740753 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Oct 28 00:28:11.740799 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:28:11.740860 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Oct 28 00:28:11.740909 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Oct 28 00:28:11.740953 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Oct 28 00:28:11.741003 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Oct 28 00:28:11.741049 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Oct 28 00:28:11.741093 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Oct 28 00:28:11.741142 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Oct 28 00:28:11.741187 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Oct 28 00:28:11.741234 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:28:11.741282 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Oct 28 00:28:11.741328 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:28:11.741403 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Oct 28 00:28:11.741467 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:28:11.741523 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Oct 28 00:28:11.741572 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:28:11.741622 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Oct 28 00:28:11.741668 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:28:11.741716 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Oct 28 00:28:11.741762 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:28:11.741810 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Oct 28 00:28:11.741857 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Oct 28 00:28:11.741901 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:28:11.741951 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Oct 28 00:28:11.741996 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Oct 28 00:28:11.742040 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:28:11.742088 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Oct 28 00:28:11.742133 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Oct 28 00:28:11.742180 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 
00:28:11.742228 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Oct 28 00:28:11.742278 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 00:28:11.742326 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Oct 28 00:28:11.742434 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:28:11.742487 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Oct 28 00:28:11.742533 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:28:11.742584 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Oct 28 00:28:11.742630 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Oct 28 00:28:11.742679 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Oct 28 00:28:11.742725 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:28:11.742775 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Oct 28 00:28:11.742820 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Oct 28 00:28:11.742868 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:28:11.742917 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Oct 28 00:28:11.742962 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Oct 28 00:28:11.743006 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:28:11.743056 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Oct 28 00:28:11.743101 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:28:11.743150 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Oct 28 00:28:11.743198 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 28 00:28:11.743247 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Oct 28 00:28:11.743292 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
Oct 28 00:28:11.743341 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Oct 28 00:28:11.743411 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:28:11.743464 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Oct 28 00:28:11.743511 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:28:11.743560 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Oct 28 00:28:11.743605 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:28:11.743660 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Oct 28 00:28:11.743669 kernel: PCI: CLS 32 bytes, default 64 Oct 28 00:28:11.743675 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Oct 28 00:28:11.743682 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 28 00:28:11.743689 kernel: clocksource: Switched to clocksource tsc Oct 28 00:28:11.743696 kernel: Initialise system trusted keyrings Oct 28 00:28:11.743702 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Oct 28 00:28:11.743708 kernel: Key type asymmetric registered Oct 28 00:28:11.743714 kernel: Asymmetric key parser 'x509' registered Oct 28 00:28:11.743719 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Oct 28 00:28:11.743725 kernel: io scheduler mq-deadline registered Oct 28 00:28:11.743731 kernel: io scheduler kyber registered Oct 28 00:28:11.743737 kernel: io scheduler bfq registered Oct 28 00:28:11.743789 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Oct 28 00:28:11.743841 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.743892 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Oct 28 00:28:11.743942 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.743992 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Oct 28 00:28:11.744041 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744091 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Oct 28 00:28:11.744144 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744194 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Oct 28 00:28:11.744243 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744293 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Oct 28 00:28:11.744342 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744407 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Oct 28 00:28:11.744458 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744511 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Oct 28 00:28:11.744562 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744611 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Oct 28 00:28:11.744661 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744712 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Oct 28 00:28:11.744761 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744811 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Oct 28 00:28:11.744863 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.744913 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Oct 28 00:28:11.744963 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745023 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Oct 28 00:28:11.745076 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745126 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Oct 28 00:28:11.745177 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745227 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Oct 28 00:28:11.745288 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745340 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Oct 28 00:28:11.745414 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745468 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Oct 28 00:28:11.745519 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745570 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Oct 28 
00:28:11.745621 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745675 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Oct 28 00:28:11.745725 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745776 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Oct 28 00:28:11.745827 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745878 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Oct 28 00:28:11.745928 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.745979 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Oct 28 00:28:11.746029 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.746082 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Oct 28 00:28:11.746133 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.746183 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Oct 28 00:28:11.746233 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.746284 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Oct 28 00:28:11.746334 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.746399 kernel: pcieport 0000:00:18.1: PME: 
Signaling with IRQ 49 Oct 28 00:28:11.746453 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.746503 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Oct 28 00:28:11.746553 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747381 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Oct 28 00:28:11.747450 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747520 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Oct 28 00:28:11.747584 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747642 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Oct 28 00:28:11.747694 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747745 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Oct 28 00:28:11.747795 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747847 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Oct 28 00:28:11.747899 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:28:11.747911 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Oct 28 00:28:11.747918 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Oct 28 00:28:11.747925 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Oct 28 
00:28:11.747931 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Oct 28 00:28:11.747938 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Oct 28 00:28:11.747944 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Oct 28 00:28:11.748754 kernel: rtc_cmos 00:01: registered as rtc0 Oct 28 00:28:11.748807 kernel: rtc_cmos 00:01: setting system clock to 2025-10-28T00:28:11 UTC (1761611291) Oct 28 00:28:11.748855 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Oct 28 00:28:11.748867 kernel: intel_pstate: CPU model not supported Oct 28 00:28:11.748874 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Oct 28 00:28:11.748880 kernel: NET: Registered PF_INET6 protocol family Oct 28 00:28:11.748887 kernel: Segment Routing with IPv6 Oct 28 00:28:11.748893 kernel: In-situ OAM (IOAM) with IPv6 Oct 28 00:28:11.748899 kernel: NET: Registered PF_PACKET protocol family Oct 28 00:28:11.748906 kernel: Key type dns_resolver registered Oct 28 00:28:11.748912 kernel: IPI shorthand broadcast: enabled Oct 28 00:28:11.748919 kernel: sched_clock: Marking stable (2559003078, 167966096)->(2741196262, -14227088) Oct 28 00:28:11.748926 kernel: registered taskstats version 1 Oct 28 00:28:11.748933 kernel: Loading compiled-in X.509 certificates Oct 28 00:28:11.748939 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.54-flatcar: f41580392593377dbc2468f7a4cdd729eee34f1f' Oct 28 00:28:11.748945 kernel: Demotion targets for Node 0: null Oct 28 00:28:11.748951 kernel: Key type .fscrypt registered Oct 28 00:28:11.748958 kernel: Key type fscrypt-provisioning registered Oct 28 00:28:11.748964 kernel: ima: No TPM chip found, activating TPM-bypass! 
Oct 28 00:28:11.748970 kernel: ima: Allocated hash algorithm: sha1 Oct 28 00:28:11.748978 kernel: ima: No architecture policies found Oct 28 00:28:11.748985 kernel: clk: Disabling unused clocks Oct 28 00:28:11.748991 kernel: Warning: unable to open an initial console. Oct 28 00:28:11.748998 kernel: Freeing unused kernel image (initmem) memory: 45532K Oct 28 00:28:11.749004 kernel: Write protecting the kernel read-only data: 40960k Oct 28 00:28:11.749010 kernel: Freeing unused kernel image (rodata/data gap) memory: 576K Oct 28 00:28:11.749017 kernel: Run /init as init process Oct 28 00:28:11.749023 kernel: with arguments: Oct 28 00:28:11.749029 kernel: /init Oct 28 00:28:11.749036 kernel: with environment: Oct 28 00:28:11.749043 kernel: HOME=/ Oct 28 00:28:11.749049 kernel: TERM=linux Oct 28 00:28:11.749056 systemd[1]: Successfully made /usr/ read-only. Oct 28 00:28:11.749065 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 28 00:28:11.749072 systemd[1]: Detected virtualization vmware. Oct 28 00:28:11.749079 systemd[1]: Detected architecture x86-64. Oct 28 00:28:11.749085 systemd[1]: Running in initrd. Oct 28 00:28:11.749092 systemd[1]: No hostname configured, using default hostname. Oct 28 00:28:11.749099 systemd[1]: Hostname set to . Oct 28 00:28:11.749106 systemd[1]: Initializing machine ID from random generator. Oct 28 00:28:11.749112 systemd[1]: Queued start job for default target initrd.target. Oct 28 00:28:11.749119 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 28 00:28:11.749125 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Oct 28 00:28:11.749133 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Oct 28 00:28:11.749140 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 28 00:28:11.749147 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 28 00:28:11.749154 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 28 00:28:11.749162 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Oct 28 00:28:11.749168 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Oct 28 00:28:11.749175 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 28 00:28:11.749182 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 28 00:28:11.749188 systemd[1]: Reached target paths.target - Path Units. Oct 28 00:28:11.749196 systemd[1]: Reached target slices.target - Slice Units. Oct 28 00:28:11.749202 systemd[1]: Reached target swap.target - Swaps. Oct 28 00:28:11.749209 systemd[1]: Reached target timers.target - Timer Units. Oct 28 00:28:11.749216 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 28 00:28:11.749223 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 28 00:28:11.749230 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 28 00:28:11.749236 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Oct 28 00:28:11.749243 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 28 00:28:11.749249 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 28 00:28:11.749257 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Oct 28 00:28:11.749264 systemd[1]: Reached target sockets.target - Socket Units. Oct 28 00:28:11.749271 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Oct 28 00:28:11.749277 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 28 00:28:11.749284 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 28 00:28:11.749290 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Oct 28 00:28:11.749297 systemd[1]: Starting systemd-fsck-usr.service... Oct 28 00:28:11.749304 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 28 00:28:11.749311 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 28 00:28:11.749318 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:28:11.749325 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 28 00:28:11.749332 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 28 00:28:11.749338 systemd[1]: Finished systemd-fsck-usr.service. Oct 28 00:28:11.749364 systemd-journald[225]: Collecting audit messages is disabled. Oct 28 00:28:11.750903 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 28 00:28:11.750912 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:28:11.750920 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 28 00:28:11.750929 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 28 00:28:11.750936 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Oct 28 00:28:11.750943 kernel: Bridge firewalling registered Oct 28 00:28:11.750950 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 28 00:28:11.750957 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 28 00:28:11.750963 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 28 00:28:11.750972 systemd-journald[225]: Journal started Oct 28 00:28:11.750987 systemd-journald[225]: Runtime Journal (/run/log/journal/ba19a47675f248e89ee76fe056c48f07) is 4.8M, max 38.5M, 33.7M free. Oct 28 00:28:11.707907 systemd-modules-load[226]: Inserted module 'overlay' Oct 28 00:28:11.740874 systemd-modules-load[226]: Inserted module 'br_netfilter' Oct 28 00:28:11.754339 systemd[1]: Started systemd-journald.service - Journal Service. Oct 28 00:28:11.756582 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 28 00:28:11.758237 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 28 00:28:11.760424 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 28 00:28:11.761207 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 28 00:28:11.761592 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 28 00:28:11.768805 systemd-tmpfiles[258]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Oct 28 00:28:11.771949 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 28 00:28:11.773428 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Oct 28 00:28:11.775120 dracut-cmdline[257]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=aad2c79de86859b0e91c2fe8391f4e8b4c39a83af1e3d16d6c1a718ff51907cd Oct 28 00:28:11.802991 systemd-resolved[272]: Positive Trust Anchors: Oct 28 00:28:11.803205 systemd-resolved[272]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 28 00:28:11.803228 systemd-resolved[272]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 28 00:28:11.805752 systemd-resolved[272]: Defaulting to hostname 'linux'. Oct 28 00:28:11.806406 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 28 00:28:11.806560 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 28 00:28:11.832382 kernel: SCSI subsystem initialized Oct 28 00:28:11.849392 kernel: Loading iSCSI transport class v2.0-870. 
Oct 28 00:28:11.857377 kernel: iscsi: registered transport (tcp) Oct 28 00:28:11.879748 kernel: iscsi: registered transport (qla4xxx) Oct 28 00:28:11.879795 kernel: QLogic iSCSI HBA Driver Oct 28 00:28:11.890613 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 28 00:28:11.904457 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 28 00:28:11.905527 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 28 00:28:11.928026 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 28 00:28:11.928856 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 28 00:28:11.967380 kernel: raid6: avx2x4 gen() 47106 MB/s Oct 28 00:28:11.983380 kernel: raid6: avx2x2 gen() 52911 MB/s Oct 28 00:28:12.000562 kernel: raid6: avx2x1 gen() 44529 MB/s Oct 28 00:28:12.000598 kernel: raid6: using algorithm avx2x2 gen() 52911 MB/s Oct 28 00:28:12.018579 kernel: raid6: .... xor() 31962 MB/s, rmw enabled Oct 28 00:28:12.018625 kernel: raid6: using avx2x2 recovery algorithm Oct 28 00:28:12.032376 kernel: xor: automatically using best checksumming function avx Oct 28 00:28:12.136378 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 28 00:28:12.139819 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 28 00:28:12.140749 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 28 00:28:12.161394 systemd-udevd[474]: Using default interface naming scheme 'v255'. Oct 28 00:28:12.164792 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 28 00:28:12.166114 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 28 00:28:12.177724 dracut-pre-trigger[480]: rd.md=0: removing MD RAID activation Oct 28 00:28:12.191306 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Oct 28 00:28:12.192249 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 28 00:28:12.269619 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 28 00:28:12.271100 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Oct 28 00:28:12.343375 kernel: VMware PVSCSI driver - version 1.0.7.0-k Oct 28 00:28:12.347540 kernel: vmw_pvscsi: using 64bit dma Oct 28 00:28:12.347570 kernel: vmw_pvscsi: max_id: 16 Oct 28 00:28:12.347578 kernel: vmw_pvscsi: setting ring_pages to 8 Oct 28 00:28:12.350371 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Oct 28 00:28:12.357413 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Oct 28 00:28:12.359378 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Oct 28 00:28:12.366158 kernel: vmw_pvscsi: enabling reqCallThreshold Oct 28 00:28:12.366191 kernel: vmw_pvscsi: driver-based request coalescing enabled Oct 28 00:28:12.366204 kernel: vmw_pvscsi: using MSI-X Oct 28 00:28:12.366212 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Oct 28 00:28:12.371839 (udev-worker)[521]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Oct 28 00:28:12.373376 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Oct 28 00:28:12.373481 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Oct 28 00:28:12.378475 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Oct 28 00:28:12.378750 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 28 00:28:12.379007 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:28:12.379384 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:28:12.381431 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Oct 28 00:28:12.391586 kernel: cryptd: max_cpu_qlen set to 1000 Oct 28 00:28:12.405382 kernel: AES CTR mode by8 optimization enabled Oct 28 00:28:12.409339 kernel: libata version 3.00 loaded. Oct 28 00:28:12.409373 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Oct 28 00:28:12.409943 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:28:12.412716 kernel: ata_piix 0000:00:07.1: version 2.13 Oct 28 00:28:12.412823 kernel: scsi host1: ata_piix Oct 28 00:28:12.414140 kernel: scsi host2: ata_piix Oct 28 00:28:12.414238 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Oct 28 00:28:12.415580 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Oct 28 00:28:12.422518 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Oct 28 00:28:12.422640 kernel: sd 0:0:0:0: [sda] Write Protect is off Oct 28 00:28:12.422706 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Oct 28 00:28:12.422767 kernel: sd 0:0:0:0: [sda] Cache data unavailable Oct 28 00:28:12.423843 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Oct 28 00:28:12.437370 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 28 00:28:12.438373 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Oct 28 00:28:12.582382 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Oct 28 00:28:12.588390 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Oct 28 00:28:12.620399 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Oct 28 00:28:12.620516 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Oct 28 00:28:12.631370 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Oct 28 00:28:12.636129 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. 
Oct 28 00:28:12.640800 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Oct 28 00:28:12.641072 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Oct 28 00:28:12.646646 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Oct 28 00:28:12.652305 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Oct 28 00:28:12.652875 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 28 00:28:12.690482 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 28 00:28:12.703376 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 28 00:28:12.864052 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 28 00:28:12.864863 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Oct 28 00:28:12.865143 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 28 00:28:12.865422 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 28 00:28:12.866111 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 28 00:28:12.883024 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 28 00:28:13.701396 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 28 00:28:13.701904 disk-uuid[627]: The operation has completed successfully. Oct 28 00:28:13.741048 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 28 00:28:13.741109 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 28 00:28:13.761527 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Oct 28 00:28:13.771092 sh[657]: Success Oct 28 00:28:13.784370 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 28 00:28:13.784392 kernel: device-mapper: uevent: version 1.0.3 Oct 28 00:28:13.786371 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Oct 28 00:28:13.792385 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Oct 28 00:28:13.830697 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Oct 28 00:28:13.832405 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Oct 28 00:28:13.840060 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Oct 28 00:28:13.850375 kernel: BTRFS: device fsid a092f950-3dee-4030-bc80-a684863b9901 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (669) Oct 28 00:28:13.852544 kernel: BTRFS info (device dm-0): first mount of filesystem a092f950-3dee-4030-bc80-a684863b9901 Oct 28 00:28:13.852563 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:28:13.861377 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 28 00:28:13.861415 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 28 00:28:13.861424 kernel: BTRFS info (device dm-0): enabling free space tree Oct 28 00:28:13.863636 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Oct 28 00:28:13.864118 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Oct 28 00:28:13.864854 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Oct 28 00:28:13.866416 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Oct 28 00:28:13.888452 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (692) Oct 28 00:28:13.888496 kernel: BTRFS info (device sda6): first mount of filesystem 3702411a-a571-4fb5-bf42-f90f8e62f927 Oct 28 00:28:13.888511 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:28:13.897068 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:28:13.897124 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:28:13.902385 kernel: BTRFS info (device sda6): last unmount of filesystem 3702411a-a571-4fb5-bf42-f90f8e62f927 Oct 28 00:28:13.907961 systemd[1]: Finished ignition-setup.service - Ignition (setup). Oct 28 00:28:13.909100 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 28 00:28:13.964987 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Oct 28 00:28:13.965650 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 28 00:28:14.055777 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 28 00:28:14.057160 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Oct 28 00:28:14.079441 ignition[711]: Ignition 2.22.0 Oct 28 00:28:14.079447 ignition[711]: Stage: fetch-offline Oct 28 00:28:14.079466 ignition[711]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:28:14.079471 ignition[711]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:28:14.079517 ignition[711]: parsed url from cmdline: "" Oct 28 00:28:14.079519 ignition[711]: no config URL provided Oct 28 00:28:14.079522 ignition[711]: reading system config file "/usr/lib/ignition/user.ign" Oct 28 00:28:14.079526 ignition[711]: no config at "/usr/lib/ignition/user.ign" Oct 28 00:28:14.081758 systemd-networkd[848]: lo: Link UP Oct 28 00:28:14.079939 ignition[711]: config successfully fetched Oct 28 00:28:14.081760 systemd-networkd[848]: lo: Gained carrier Oct 28 00:28:14.079957 ignition[711]: parsing config with SHA512: 6fc52307e9e24930bfe322ac55ce9adffeda70bcca97d8b3ac752a84fdc96bc108922c73cba3171144eeb0472a36da4c3bbad0e8b23651648524131ccc265f58 Oct 28 00:28:14.083784 systemd-networkd[848]: Enumeration completed Oct 28 00:28:14.083843 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 28 00:28:14.084015 systemd-networkd[848]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Oct 28 00:28:14.084180 systemd[1]: Reached target network.target - Network. 
Oct 28 00:28:14.086368 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Oct 28 00:28:14.086476 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Oct 28 00:28:14.088356 unknown[711]: fetched base config from "system" Oct 28 00:28:14.088637 systemd-networkd[848]: ens192: Link UP Oct 28 00:28:14.088639 systemd-networkd[848]: ens192: Gained carrier Oct 28 00:28:14.088796 unknown[711]: fetched user config from "vmware" Oct 28 00:28:14.089011 ignition[711]: fetch-offline: fetch-offline passed Oct 28 00:28:14.089044 ignition[711]: Ignition finished successfully Oct 28 00:28:14.090104 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 28 00:28:14.090440 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Oct 28 00:28:14.091190 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Oct 28 00:28:14.105913 ignition[853]: Ignition 2.22.0 Oct 28 00:28:14.105920 ignition[853]: Stage: kargs Oct 28 00:28:14.105993 ignition[853]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:28:14.105998 ignition[853]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:28:14.107128 ignition[853]: kargs: kargs passed Oct 28 00:28:14.107259 ignition[853]: Ignition finished successfully Oct 28 00:28:14.108547 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 28 00:28:14.109508 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Oct 28 00:28:14.129790 ignition[860]: Ignition 2.22.0 Oct 28 00:28:14.129800 ignition[860]: Stage: disks Oct 28 00:28:14.129879 ignition[860]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:28:14.129885 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:28:14.130495 ignition[860]: disks: disks passed Oct 28 00:28:14.130524 ignition[860]: Ignition finished successfully Oct 28 00:28:14.131380 systemd[1]: Finished ignition-disks.service - Ignition (disks). Oct 28 00:28:14.131721 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Oct 28 00:28:14.132097 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Oct 28 00:28:14.132377 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 28 00:28:14.132618 systemd[1]: Reached target sysinit.target - System Initialization. Oct 28 00:28:14.132843 systemd[1]: Reached target basic.target - Basic System. Oct 28 00:28:14.133549 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Oct 28 00:28:14.154427 systemd-fsck[868]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Oct 28 00:28:14.156097 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Oct 28 00:28:14.157006 systemd[1]: Mounting sysroot.mount - /sysroot... Oct 28 00:28:14.231039 systemd[1]: Mounted sysroot.mount - /sysroot. Oct 28 00:28:14.231638 kernel: EXT4-fs (sda9): mounted filesystem de2a803a-3065-4213-8ac2-f1a57806d341 r/w with ordered data mode. Quota mode: none. Oct 28 00:28:14.231497 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Oct 28 00:28:14.232484 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 28 00:28:14.234395 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Oct 28 00:28:14.234817 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Oct 28 00:28:14.235023 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Oct 28 00:28:14.235038 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Oct 28 00:28:14.243226 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Oct 28 00:28:14.244269 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Oct 28 00:28:14.251706 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (876) Oct 28 00:28:14.251730 kernel: BTRFS info (device sda6): first mount of filesystem 3702411a-a571-4fb5-bf42-f90f8e62f927 Oct 28 00:28:14.251739 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:28:14.258455 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:28:14.258477 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:28:14.259266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 28 00:28:14.277378 initrd-setup-root[900]: cut: /sysroot/etc/passwd: No such file or directory Oct 28 00:28:14.279701 initrd-setup-root[907]: cut: /sysroot/etc/group: No such file or directory Oct 28 00:28:14.282526 initrd-setup-root[914]: cut: /sysroot/etc/shadow: No such file or directory Oct 28 00:28:14.284960 initrd-setup-root[921]: cut: /sysroot/etc/gshadow: No such file or directory Oct 28 00:28:14.338614 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Oct 28 00:28:14.339593 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Oct 28 00:28:14.340422 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Oct 28 00:28:14.348376 kernel: BTRFS info (device sda6): last unmount of filesystem 3702411a-a571-4fb5-bf42-f90f8e62f927 Oct 28 00:28:14.361775 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Oct 28 00:28:14.363353 ignition[988]: INFO : Ignition 2.22.0 Oct 28 00:28:14.363353 ignition[988]: INFO : Stage: mount Oct 28 00:28:14.363659 ignition[988]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 28 00:28:14.363659 ignition[988]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:28:14.364048 ignition[988]: INFO : mount: mount passed Oct 28 00:28:14.364410 ignition[988]: INFO : Ignition finished successfully Oct 28 00:28:14.365099 systemd[1]: Finished ignition-mount.service - Ignition (mount). Oct 28 00:28:14.366048 systemd[1]: Starting ignition-files.service - Ignition (files)... Oct 28 00:28:14.849441 systemd[1]: sysroot-oem.mount: Deactivated successfully. Oct 28 00:28:14.850691 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 28 00:28:14.864374 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1000) Oct 28 00:28:14.866808 kernel: BTRFS info (device sda6): first mount of filesystem 3702411a-a571-4fb5-bf42-f90f8e62f927 Oct 28 00:28:14.866832 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:28:14.870427 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:28:14.870449 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:28:14.871535 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Oct 28 00:28:14.892028 ignition[1017]: INFO : Ignition 2.22.0
Oct 28 00:28:14.892028 ignition[1017]: INFO : Stage: files
Oct 28 00:28:14.892393 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d"
Oct 28 00:28:14.892393 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Oct 28 00:28:14.892700 ignition[1017]: DEBUG : files: compiled without relabeling support, skipping
Oct 28 00:28:14.897602 ignition[1017]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Oct 28 00:28:14.897602 ignition[1017]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Oct 28 00:28:14.908711 ignition[1017]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Oct 28 00:28:14.908980 ignition[1017]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Oct 28 00:28:14.909114 ignition[1017]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Oct 28 00:28:14.908993 unknown[1017]: wrote ssh authorized keys file for user: core
Oct 28 00:28:14.912607 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Oct 28 00:28:14.912797 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Oct 28 00:28:14.964453 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Oct 28 00:28:15.035080 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Oct 28 00:28:15.037505 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Oct 28 00:28:15.037690 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Oct 28 00:28:15.037690 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Oct 28 00:28:15.039802 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Oct 28 00:28:15.039802 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Oct 28 00:28:15.040237 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Oct 28 00:28:15.504949 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Oct 28 00:28:15.816575 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Oct 28 00:28:15.816901 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Oct 28 00:28:15.817832 ignition[1017]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Oct 28 00:28:15.818007 ignition[1017]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Oct 28 00:28:15.820511 ignition[1017]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Oct 28 00:28:15.821136 ignition[1017]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Oct 28 00:28:15.821136 ignition[1017]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Oct 28 00:28:15.821136 ignition[1017]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Oct 28 00:28:15.821573 ignition[1017]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Oct 28 00:28:15.821573 ignition[1017]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Oct 28 00:28:15.821573 ignition[1017]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Oct 28 00:28:15.821573 ignition[1017]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Oct 28 00:28:15.891389 ignition[1017]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Oct 28 00:28:15.894233 ignition[1017]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Oct 28 00:28:15.894479 ignition[1017]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Oct 28 00:28:15.894479 ignition[1017]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Oct 28 00:28:15.894479 ignition[1017]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Oct 28 00:28:15.894479 ignition[1017]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Oct 28 00:28:15.896100 ignition[1017]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Oct 28 00:28:15.896100 ignition[1017]: INFO : files: files passed
Oct 28 00:28:15.896100 ignition[1017]: INFO : Ignition finished successfully
Oct 28 00:28:15.895929 systemd[1]: Finished ignition-files.service - Ignition (files).
Oct 28 00:28:15.897337 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Oct 28 00:28:15.898462 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Oct 28 00:28:15.909718 systemd[1]: ignition-quench.service: Deactivated successfully.
Oct 28 00:28:15.910406 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Oct 28 00:28:15.912953 initrd-setup-root-after-ignition[1048]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Oct 28 00:28:15.912953 initrd-setup-root-after-ignition[1048]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Oct 28 00:28:15.913717 initrd-setup-root-after-ignition[1052]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Oct 28 00:28:15.914516 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Oct 28 00:28:15.914976 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Oct 28 00:28:15.915524 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Oct 28 00:28:15.940507 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 28 00:28:15.940583 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Oct 28 00:28:15.940846 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Oct 28 00:28:15.941108 systemd[1]: Reached target initrd.target - Initrd Default Target.
Oct 28 00:28:15.941314 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Oct 28 00:28:15.941776 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Oct 28 00:28:15.950760 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Oct 28 00:28:15.951648 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Oct 28 00:28:15.969731 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Oct 28 00:28:15.969945 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Oct 28 00:28:15.970179 systemd[1]: Stopped target timers.target - Timer Units.
Oct 28 00:28:15.970390 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 28 00:28:15.970470 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Oct 28 00:28:15.970831 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Oct 28 00:28:15.970993 systemd[1]: Stopped target basic.target - Basic System.
Oct 28 00:28:15.971180 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Oct 28 00:28:15.971389 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Oct 28 00:28:15.971574 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Oct 28 00:28:15.971786 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Oct 28 00:28:15.971987 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Oct 28 00:28:15.972193 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Oct 28 00:28:15.972419 systemd[1]: Stopped target sysinit.target - System Initialization.
Oct 28 00:28:15.972638 systemd[1]: Stopped target local-fs.target - Local File Systems.
Oct 28 00:28:15.972837 systemd[1]: Stopped target swap.target - Swaps.
Oct 28 00:28:15.973007 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 28 00:28:15.973078 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Oct 28 00:28:15.973392 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Oct 28 00:28:15.973624 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Oct 28 00:28:15.973816 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Oct 28 00:28:15.973865 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Oct 28 00:28:15.974048 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 28 00:28:15.974111 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Oct 28 00:28:15.974406 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Oct 28 00:28:15.974476 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Oct 28 00:28:15.974703 systemd[1]: Stopped target paths.target - Path Units.
Oct 28 00:28:15.974855 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 28 00:28:15.978400 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Oct 28 00:28:15.978622 systemd[1]: Stopped target slices.target - Slice Units.
Oct 28 00:28:15.978853 systemd[1]: Stopped target sockets.target - Socket Units.
Oct 28 00:28:15.979022 systemd[1]: iscsid.socket: Deactivated successfully.
Oct 28 00:28:15.979080 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Oct 28 00:28:15.979240 systemd[1]: iscsiuio.socket: Deactivated successfully.
Oct 28 00:28:15.979286 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Oct 28 00:28:15.979481 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Oct 28 00:28:15.979557 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Oct 28 00:28:15.979802 systemd[1]: ignition-files.service: Deactivated successfully.
Oct 28 00:28:15.979865 systemd[1]: Stopped ignition-files.service - Ignition (files).
Oct 28 00:28:15.980556 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Oct 28 00:28:15.980674 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 28 00:28:15.980739 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Oct 28 00:28:15.981307 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Oct 28 00:28:15.982414 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 28 00:28:15.982497 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Oct 28 00:28:15.982736 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 28 00:28:15.982795 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Oct 28 00:28:15.985546 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 28 00:28:15.990772 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Oct 28 00:28:16.000497 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Oct 28 00:28:16.002928 ignition[1072]: INFO : Ignition 2.22.0
Oct 28 00:28:16.002928 ignition[1072]: INFO : Stage: umount
Oct 28 00:28:16.002928 ignition[1072]: INFO : no configs at "/usr/lib/ignition/base.d"
Oct 28 00:28:16.002928 ignition[1072]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Oct 28 00:28:16.002928 ignition[1072]: INFO : umount: umount passed
Oct 28 00:28:16.002928 ignition[1072]: INFO : Ignition finished successfully
Oct 28 00:28:16.005523 systemd[1]: ignition-mount.service: Deactivated successfully.
Oct 28 00:28:16.005591 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Oct 28 00:28:16.006060 systemd[1]: Stopped target network.target - Network.
Oct 28 00:28:16.006280 systemd[1]: ignition-disks.service: Deactivated successfully.
Oct 28 00:28:16.006311 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Oct 28 00:28:16.006495 systemd[1]: ignition-kargs.service: Deactivated successfully.
Oct 28 00:28:16.006517 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Oct 28 00:28:16.006716 systemd[1]: ignition-setup.service: Deactivated successfully.
Oct 28 00:28:16.006743 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Oct 28 00:28:16.006876 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Oct 28 00:28:16.006900 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Oct 28 00:28:16.007135 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Oct 28 00:28:16.007286 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Oct 28 00:28:16.012503 systemd[1]: systemd-resolved.service: Deactivated successfully.
Oct 28 00:28:16.012572 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Oct 28 00:28:16.013907 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Oct 28 00:28:16.014044 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 28 00:28:16.014070 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Oct 28 00:28:16.014857 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 28 00:28:16.019183 systemd[1]: systemd-networkd.service: Deactivated successfully.
Oct 28 00:28:16.019253 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Oct 28 00:28:16.019940 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Oct 28 00:28:16.020038 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Oct 28 00:28:16.020190 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Oct 28 00:28:16.020208 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Oct 28 00:28:16.020949 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Oct 28 00:28:16.021051 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Oct 28 00:28:16.021078 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Oct 28 00:28:16.021206 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Oct 28 00:28:16.021228 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Oct 28 00:28:16.021356 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 28 00:28:16.021390 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Oct 28 00:28:16.021543 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 28 00:28:16.021565 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Oct 28 00:28:16.022963 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Oct 28 00:28:16.023985 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 28 00:28:16.032555 systemd[1]: network-cleanup.service: Deactivated successfully.
Oct 28 00:28:16.032626 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Oct 28 00:28:16.032913 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 28 00:28:16.032994 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Oct 28 00:28:16.033387 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 28 00:28:16.033417 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Oct 28 00:28:16.033536 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 28 00:28:16.033552 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Oct 28 00:28:16.033713 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 28 00:28:16.033736 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Oct 28 00:28:16.034004 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 28 00:28:16.034028 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Oct 28 00:28:16.034318 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Oct 28 00:28:16.034342 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Oct 28 00:28:16.035430 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Oct 28 00:28:16.035554 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Oct 28 00:28:16.035579 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Oct 28 00:28:16.036440 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 28 00:28:16.036471 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Oct 28 00:28:16.037547 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 28 00:28:16.037575 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Oct 28 00:28:16.044419 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 28 00:28:16.044630 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Oct 28 00:28:16.074430 systemd[1]: sysroot-boot.service: Deactivated successfully.
Oct 28 00:28:16.074496 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Oct 28 00:28:16.074747 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Oct 28 00:28:16.074837 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Oct 28 00:28:16.074864 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Oct 28 00:28:16.075966 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Oct 28 00:28:16.089502 systemd[1]: Switching root.
Oct 28 00:28:16.130160 systemd-journald[225]: Journal stopped
Oct 28 00:28:17.358277 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Oct 28 00:28:17.358298 kernel: SELinux: policy capability network_peer_controls=1
Oct 28 00:28:17.358306 kernel: SELinux: policy capability open_perms=1
Oct 28 00:28:17.358311 kernel: SELinux: policy capability extended_socket_class=1
Oct 28 00:28:17.358317 kernel: SELinux: policy capability always_check_network=0
Oct 28 00:28:17.358323 kernel: SELinux: policy capability cgroup_seclabel=1
Oct 28 00:28:17.358329 kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 28 00:28:17.358336 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Oct 28 00:28:17.358342 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Oct 28 00:28:17.358348 kernel: SELinux: policy capability userspace_initial_context=0
Oct 28 00:28:17.358354 kernel: audit: type=1403 audit(1761611296.802:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 28 00:28:17.358367 systemd[1]: Successfully loaded SELinux policy in 58.777ms.
Oct 28 00:28:17.358374 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.658ms.
Oct 28 00:28:17.358383 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Oct 28 00:28:17.358390 systemd[1]: Detected virtualization vmware.
Oct 28 00:28:17.358397 systemd[1]: Detected architecture x86-64.
Oct 28 00:28:17.358403 systemd[1]: Detected first boot.
Oct 28 00:28:17.358409 systemd[1]: Initializing machine ID from random generator.
Oct 28 00:28:17.358417 zram_generator::config[1116]: No configuration found.
Oct 28 00:28:17.358502 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Oct 28 00:28:17.358513 kernel: Guest personality initialized and is active
Oct 28 00:28:17.358519 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Oct 28 00:28:17.358525 kernel: Initialized host personality
Oct 28 00:28:17.358532 kernel: NET: Registered PF_VSOCK protocol family
Oct 28 00:28:17.358541 systemd[1]: Populated /etc with preset unit settings.
Oct 28 00:28:17.358548 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Oct 28 00:28:17.358555 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Oct 28 00:28:17.358562 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Oct 28 00:28:17.358569 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 28 00:28:17.358575 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Oct 28 00:28:17.358582 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 28 00:28:17.358590 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Oct 28 00:28:17.358597 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Oct 28 00:28:17.358604 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Oct 28 00:28:17.358611 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Oct 28 00:28:17.358617 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Oct 28 00:28:17.358624 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Oct 28 00:28:17.358631 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Oct 28 00:28:17.358637 systemd[1]: Created slice user.slice - User and Session Slice.
Oct 28 00:28:17.358645 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Oct 28 00:28:17.358653 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Oct 28 00:28:17.358661 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Oct 28 00:28:17.358667 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Oct 28 00:28:17.358674 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Oct 28 00:28:17.358681 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Oct 28 00:28:17.358688 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Oct 28 00:28:17.358696 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Oct 28 00:28:17.358703 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Oct 28 00:28:17.358710 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Oct 28 00:28:17.358716 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Oct 28 00:28:17.358723 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Oct 28 00:28:17.358730 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Oct 28 00:28:17.358736 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Oct 28 00:28:17.358743 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Oct 28 00:28:17.358749 systemd[1]: Reached target slices.target - Slice Units.
Oct 28 00:28:17.358757 systemd[1]: Reached target swap.target - Swaps.
Oct 28 00:28:17.358764 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Oct 28 00:28:17.358772 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Oct 28 00:28:17.358778 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Oct 28 00:28:17.358786 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Oct 28 00:28:17.358797 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Oct 28 00:28:17.358807 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Oct 28 00:28:17.358817 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Oct 28 00:28:17.358828 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Oct 28 00:28:17.358837 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Oct 28 00:28:17.358844 systemd[1]: Mounting media.mount - External Media Directory...
Oct 28 00:28:17.358852 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Oct 28 00:28:17.358859 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Oct 28 00:28:17.358867 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Oct 28 00:28:17.358875 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Oct 28 00:28:17.358882 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 28 00:28:17.358889 systemd[1]: Reached target machines.target - Containers.
Oct 28 00:28:17.358895 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Oct 28 00:28:17.358903 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Oct 28 00:28:17.358909 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Oct 28 00:28:17.358916 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Oct 28 00:28:17.358924 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Oct 28 00:28:17.358932 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Oct 28 00:28:17.358939 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Oct 28 00:28:17.358945 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Oct 28 00:28:17.358952 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Oct 28 00:28:17.358959 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Oct 28 00:28:17.358966 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 28 00:28:17.358973 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Oct 28 00:28:17.358980 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Oct 28 00:28:17.358988 systemd[1]: Stopped systemd-fsck-usr.service.
Oct 28 00:28:17.358995 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Oct 28 00:28:17.359003 systemd[1]: Starting systemd-journald.service - Journal Service...
Oct 28 00:28:17.359010 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Oct 28 00:28:17.359017 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Oct 28 00:28:17.359024 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Oct 28 00:28:17.359031 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Oct 28 00:28:17.359037 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Oct 28 00:28:17.359045 systemd[1]: verity-setup.service: Deactivated successfully.
Oct 28 00:28:17.359052 systemd[1]: Stopped verity-setup.service.
Oct 28 00:28:17.359059 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Oct 28 00:28:17.359066 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Oct 28 00:28:17.359073 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Oct 28 00:28:17.359080 systemd[1]: Mounted media.mount - External Media Directory.
Oct 28 00:28:17.359087 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Oct 28 00:28:17.359094 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Oct 28 00:28:17.359102 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Oct 28 00:28:17.359109 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Oct 28 00:28:17.359116 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Oct 28 00:28:17.359122 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 28 00:28:17.359129 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Oct 28 00:28:17.359136 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Oct 28 00:28:17.359142 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Oct 28 00:28:17.359150 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 28 00:28:17.359157 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Oct 28 00:28:17.359165 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Oct 28 00:28:17.359172 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Oct 28 00:28:17.359179 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Oct 28 00:28:17.359186 systemd[1]: Reached target network-pre.target - Preparation for Network.
Oct 28 00:28:17.359193 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Oct 28 00:28:17.359200 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Oct 28 00:28:17.359206 kernel: loop: module loaded
Oct 28 00:28:17.359213 systemd[1]: Reached target local-fs.target - Local File Systems.
Oct 28 00:28:17.359224 kernel: fuse: init (API version 7.41)
Oct 28 00:28:17.359230 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Oct 28 00:28:17.359251 systemd-journald[1216]: Collecting audit messages is disabled.
Oct 28 00:28:17.360103 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Oct 28 00:28:17.360115 systemd-journald[1216]: Journal started
Oct 28 00:28:17.360132 systemd-journald[1216]: Runtime Journal (/run/log/journal/37aec0ceb40141498a3ae053499a7396) is 4.8M, max 38.5M, 33.7M free.
Oct 28 00:28:17.164273 systemd[1]: Queued start job for default target multi-user.target.
Oct 28 00:28:17.176357 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Oct 28 00:28:17.176652 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 28 00:28:17.360632 jq[1186]: true
Oct 28 00:28:17.361166 jq[1231]: true
Oct 28 00:28:17.365370 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 28 00:28:17.369406 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Oct 28 00:28:17.369440 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 28 00:28:17.372374 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Oct 28 00:28:17.378102 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Oct 28 00:28:17.399021 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Oct 28 00:28:17.399059 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Oct 28 00:28:17.399071 systemd[1]: Started systemd-journald.service - Journal Service.
Oct 28 00:28:17.398902 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 28 00:28:17.399224 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Oct 28 00:28:17.399842 systemd[1]: modprobe@loop.service: Deactivated successfully.
Oct 28 00:28:17.399956 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Oct 28 00:28:17.400206 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Oct 28 00:28:17.402368 kernel: ACPI: bus type drm_connector registered
Oct 28 00:28:17.405341 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 28 00:28:17.406581 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Oct 28 00:28:17.406777 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Oct 28 00:28:17.407022 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Oct 28 00:28:17.416818 kernel: loop0: detected capacity change from 0 to 128016
Oct 28 00:28:17.416227 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Oct 28 00:28:17.418451 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Oct 28 00:28:17.419725 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Oct 28 00:28:17.421552 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Oct 28 00:28:17.421690 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 28 00:28:17.429117 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 28 00:28:17.445857 systemd-journald[1216]: Time spent on flushing to /var/log/journal/37aec0ceb40141498a3ae053499a7396 is 66.874ms for 1761 entries. Oct 28 00:28:17.445857 systemd-journald[1216]: System Journal (/var/log/journal/37aec0ceb40141498a3ae053499a7396) is 8M, max 584.8M, 576.8M free. Oct 28 00:28:17.518446 systemd-journald[1216]: Received client request to flush runtime journal. Oct 28 00:28:17.518481 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 28 00:28:17.470867 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 28 00:28:17.452192 ignition[1241]: Ignition 2.22.0 Oct 28 00:28:17.481022 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Oct 28 00:28:17.453936 ignition[1241]: deleting config from guestinfo properties Oct 28 00:28:17.495392 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Oct 28 00:28:17.492398 ignition[1241]: Successfully deleted config Oct 28 00:28:17.519747 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Oct 28 00:28:17.544563 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 28 00:28:17.588417 kernel: loop1: detected capacity change from 0 to 2960 Oct 28 00:28:17.588310 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 28 00:28:17.589434 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 28 00:28:17.671067 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. Oct 28 00:28:17.671079 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. 
Oct 28 00:28:17.673625 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Oct 28 00:28:17.831386 kernel: loop2: detected capacity change from 0 to 229808
Oct 28 00:28:18.177585 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Oct 28 00:28:18.241381 kernel: loop3: detected capacity change from 0 to 110984
Oct 28 00:28:18.347998 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Oct 28 00:28:18.349300 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Oct 28 00:28:18.369137 systemd-udevd[1291]: Using default interface naming scheme 'v255'.
Oct 28 00:28:18.438379 kernel: loop4: detected capacity change from 0 to 128016
Oct 28 00:28:18.446799 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Oct 28 00:28:18.449393 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Oct 28 00:28:18.460905 kernel: loop5: detected capacity change from 0 to 2960
Oct 28 00:28:18.471697 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Oct 28 00:28:18.486374 kernel: loop6: detected capacity change from 0 to 229808
Oct 28 00:28:18.509380 kernel: loop7: detected capacity change from 0 to 110984
Oct 28 00:28:18.549327 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Oct 28 00:28:18.559407 (sd-merge)[1293]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Oct 28 00:28:18.560270 (sd-merge)[1293]: Merged extensions into '/usr'.
Oct 28 00:28:18.574407 systemd[1]: Reload requested from client PID 1238 ('systemd-sysext') (unit systemd-sysext.service)...
Oct 28 00:28:18.574418 systemd[1]: Reloading...
Oct 28 00:28:18.628546 systemd-networkd[1294]: lo: Link UP
Oct 28 00:28:18.628553 systemd-networkd[1294]: lo: Gained carrier
Oct 28 00:28:18.629057 systemd-networkd[1294]: Enumeration completed
Oct 28 00:28:18.639384 zram_generator::config[1355]: No configuration found.
Oct 28 00:28:18.690170 systemd-networkd[1294]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Oct 28 00:28:18.697520 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Oct 28 00:28:18.697661 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Oct 28 00:28:18.697745 kernel: mousedev: PS/2 mouse device common for all mice
Oct 28 00:28:18.705841 systemd-networkd[1294]: ens192: Link UP
Oct 28 00:28:18.706103 systemd-networkd[1294]: ens192: Gained carrier
Oct 28 00:28:18.720373 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Oct 28 00:28:18.726514 kernel: ACPI: button: Power Button [PWRF]
Oct 28 00:28:18.817230 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Oct 28 00:28:18.823904 ldconfig[1234]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Oct 28 00:28:18.860373 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Oct 28 00:28:18.891248 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Oct 28 00:28:18.891470 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Oct 28 00:28:18.891793 systemd[1]: Reloading finished in 317 ms.
Oct 28 00:28:18.902884 systemd[1]: Started systemd-networkd.service - Network Configuration.
Oct 28 00:28:18.903214 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Oct 28 00:28:18.903542 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Oct 28 00:28:18.930499 systemd[1]: Starting ensure-sysext.service...
Oct 28 00:28:18.931231 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Oct 28 00:28:18.935401 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Oct 28 00:28:18.938514 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Oct 28 00:28:18.941615 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Oct 28 00:28:18.951521 systemd[1]: Reload requested from client PID 1433 ('systemctl') (unit ensure-sysext.service)...
Oct 28 00:28:18.951530 systemd[1]: Reloading...
Oct 28 00:28:18.993446 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Oct 28 00:28:18.993467 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Oct 28 00:28:18.993637 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Oct 28 00:28:18.993800 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Oct 28 00:28:18.994299 systemd-tmpfiles[1437]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Oct 28 00:28:18.998534 systemd-tmpfiles[1437]: ACLs are not supported, ignoring.
Oct 28 00:28:18.998573 systemd-tmpfiles[1437]: ACLs are not supported, ignoring.
Oct 28 00:28:18.998693 (udev-worker)[1300]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Oct 28 00:28:19.012390 zram_generator::config[1466]: No configuration found.
Oct 28 00:28:19.017567 systemd-tmpfiles[1437]: Detected autofs mount point /boot during canonicalization of boot.
Oct 28 00:28:19.017573 systemd-tmpfiles[1437]: Skipping /boot
Oct 28 00:28:19.029647 systemd-tmpfiles[1437]: Detected autofs mount point /boot during canonicalization of boot.
Oct 28 00:28:19.029654 systemd-tmpfiles[1437]: Skipping /boot
Oct 28 00:28:19.108662 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Oct 28 00:28:19.168430 systemd[1]: Reloading finished in 216 ms.
Oct 28 00:28:19.202228 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Oct 28 00:28:19.202569 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Oct 28 00:28:19.202842 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Oct 28 00:28:19.215280 systemd[1]: Finished ensure-sysext.service.
Oct 28 00:28:19.216518 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Oct 28 00:28:19.217459 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Oct 28 00:28:19.228809 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Oct 28 00:28:19.229607 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Oct 28 00:28:19.232109 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Oct 28 00:28:19.238347 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Oct 28 00:28:19.239712 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Oct 28 00:28:19.240147 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 28 00:28:19.240173 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Oct 28 00:28:19.243475 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Oct 28 00:28:19.250613 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Oct 28 00:28:19.252520 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Oct 28 00:28:19.254264 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Oct 28 00:28:19.257607 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Oct 28 00:28:19.257731 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Oct 28 00:28:19.258208 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Oct 28 00:28:19.259435 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Oct 28 00:28:19.259729 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 28 00:28:19.259857 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Oct 28 00:28:19.260076 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 28 00:28:19.260178 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Oct 28 00:28:19.260484 systemd[1]: modprobe@loop.service: Deactivated successfully.
Oct 28 00:28:19.260587 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Oct 28 00:28:19.262803 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 28 00:28:19.262847 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 28 00:28:19.277454 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Oct 28 00:28:19.285129 augenrules[1568]: No rules
Oct 28 00:28:19.287615 systemd[1]: audit-rules.service: Deactivated successfully.
Oct 28 00:28:19.287767 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Oct 28 00:28:19.290608 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Oct 28 00:28:19.292326 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Oct 28 00:28:19.314518 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Oct 28 00:28:19.322821 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Oct 28 00:28:19.322983 systemd[1]: Reached target time-set.target - System Time Set.
Oct 28 00:28:19.330616 systemd-resolved[1543]: Positive Trust Anchors:
Oct 28 00:28:19.330625 systemd-resolved[1543]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Oct 28 00:28:19.330649 systemd-resolved[1543]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Oct 28 00:28:19.333503 systemd-resolved[1543]: Defaulting to hostname 'linux'.
Oct 28 00:28:19.335897 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Oct 28 00:28:19.339970 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Oct 28 00:28:19.340639 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Oct 28 00:28:19.341050 systemd[1]: Reached target network.target - Network.
Oct 28 00:28:19.341205 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Oct 28 00:28:19.341350 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Oct 28 00:28:19.341418 systemd[1]: Reached target sysinit.target - System Initialization.
Oct 28 00:28:19.341587 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Oct 28 00:28:19.341724 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Oct 28 00:28:19.341845 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Oct 28 00:28:19.342039 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Oct 28 00:28:19.342195 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Oct 28 00:28:19.342337 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Oct 28 00:28:19.342500 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Oct 28 00:28:19.342520 systemd[1]: Reached target paths.target - Path Units.
Oct 28 00:28:19.342614 systemd[1]: Reached target timers.target - Timer Units.
Oct 28 00:28:19.343402 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Oct 28 00:28:19.344348 systemd[1]: Starting docker.socket - Docker Socket for the API...
Oct 28 00:28:19.345846 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Oct 28 00:28:19.346040 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Oct 28 00:28:19.346163 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Oct 28 00:28:19.349725 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Oct 28 00:28:19.349981 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Oct 28 00:28:19.350525 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Oct 28 00:28:19.351013 systemd[1]: Reached target sockets.target - Socket Units.
Oct 28 00:28:19.351115 systemd[1]: Reached target basic.target - Basic System.
Oct 28 00:28:19.351241 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Oct 28 00:28:19.351259 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Oct 28 00:28:19.351863 systemd[1]: Starting containerd.service - containerd container runtime...
Oct 28 00:28:19.354461 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Oct 28 00:28:19.355650 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Oct 28 00:28:19.357086 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Oct 28 00:28:19.360466 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Oct 28 00:28:19.360596 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Oct 28 00:28:19.363390 jq[1587]: false
Oct 28 00:28:19.364395 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Oct 28 00:28:19.365896 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Oct 28 00:28:19.366818 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Oct 28 00:28:19.368028 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Oct 28 00:28:19.369670 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Oct 28 00:28:19.376046 systemd[1]: Starting systemd-logind.service - User Login Management...
Oct 28 00:28:19.376665 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Oct 28 00:28:19.377126 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Oct 28 00:28:19.382331 extend-filesystems[1588]: Found /dev/sda6
Oct 28 00:28:19.382326 systemd[1]: Starting update-engine.service - Update Engine...
Oct 28 00:28:19.385605 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Refreshing passwd entry cache
Oct 28 00:28:19.385996 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Oct 28 00:28:19.386384 oslogin_cache_refresh[1589]: Refreshing passwd entry cache
Oct 28 00:28:19.387808 extend-filesystems[1588]: Found /dev/sda9
Oct 28 00:28:19.388714 extend-filesystems[1588]: Checking size of /dev/sda9
Oct 28 00:28:19.388808 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Oct 28 00:28:19.395668 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Oct 28 00:28:19.395945 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Oct 28 00:28:19.396073 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Oct 28 00:28:19.396823 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Oct 28 00:28:19.396945 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Oct 28 00:28:19.398102 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Failure getting users, quitting
Oct 28 00:28:19.398131 oslogin_cache_refresh[1589]: Failure getting users, quitting
Oct 28 00:28:19.398169 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Oct 28 00:28:19.398188 oslogin_cache_refresh[1589]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Oct 28 00:28:19.398242 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Refreshing group entry cache
Oct 28 00:28:19.398263 oslogin_cache_refresh[1589]: Refreshing group entry cache
Oct 28 00:28:19.399425 extend-filesystems[1588]: Old size kept for /dev/sda9
Oct 28 00:28:19.401133 systemd[1]: extend-filesystems.service: Deactivated successfully.
Oct 28 00:28:19.401280 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Oct 28 00:28:19.402465 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Failure getting groups, quitting
Oct 28 00:28:19.402465 google_oslogin_nss_cache[1589]: oslogin_cache_refresh[1589]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Oct 28 00:28:19.401954 oslogin_cache_refresh[1589]: Failure getting groups, quitting
Oct 28 00:28:19.401961 oslogin_cache_refresh[1589]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Oct 28 00:28:19.404900 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Oct 28 00:28:19.411719 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Oct 28 00:29:57.001659 systemd-timesyncd[1553]: Contacted time server 23.150.41.123:123 (0.flatcar.pool.ntp.org).
Oct 28 00:29:57.001686 systemd-timesyncd[1553]: Initial clock synchronization to Tue 2025-10-28 00:29:57.001607 UTC.
Oct 28 00:29:57.007874 systemd-resolved[1543]: Clock change detected. Flushing caches.
Oct 28 00:29:57.008942 systemd[1]: motdgen.service: Deactivated successfully.
Oct 28 00:29:57.009094 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Oct 28 00:29:57.014793 tar[1610]: linux-amd64/LICENSE
Oct 28 00:29:57.014793 tar[1610]: linux-amd64/helm
Oct 28 00:29:57.016725 update_engine[1598]: I20251028 00:29:57.016678 1598 main.cc:92] Flatcar Update Engine starting
Oct 28 00:29:57.018451 jq[1600]: true
Oct 28 00:29:57.034069 (ntainerd)[1625]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Oct 28 00:29:57.043746 jq[1633]: true
Oct 28 00:29:57.047652 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Oct 28 00:29:57.049955 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Oct 28 00:29:57.085528 dbus-daemon[1585]: [system] SELinux support is enabled
Oct 28 00:29:57.085877 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Oct 28 00:29:57.089359 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Oct 28 00:29:57.089379 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Oct 28 00:29:57.089580 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Oct 28 00:29:57.089593 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Oct 28 00:29:57.094440 systemd[1]: Started update-engine.service - Update Engine.
Oct 28 00:29:57.095973 update_engine[1598]: I20251028 00:29:57.094627 1598 update_check_scheduler.cc:74] Next update check in 11m37s
Oct 28 00:29:57.122694 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Oct 28 00:29:57.125973 unknown[1635]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Oct 28 00:29:57.126242 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Oct 28 00:29:57.135837 unknown[1635]: Core dump limit set to -1
Oct 28 00:29:57.174683 bash[1657]: Updated "/home/core/.ssh/authorized_keys"
Oct 28 00:29:57.173698 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Oct 28 00:29:57.174136 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Oct 28 00:29:57.175273 systemd-logind[1597]: Watching system buttons on /dev/input/event2 (Power Button)
Oct 28 00:29:57.176039 systemd-logind[1597]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Oct 28 00:29:57.182729 systemd-logind[1597]: New seat seat0.
Oct 28 00:29:57.183152 systemd[1]: Started systemd-logind.service - User Login Management.
Oct 28 00:29:57.322740 locksmithd[1644]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Oct 28 00:29:57.340563 containerd[1625]: time="2025-10-28T00:29:57Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Oct 28 00:29:57.343592 containerd[1625]: time="2025-10-28T00:29:57.343464136Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Oct 28 00:29:57.357758 containerd[1625]: time="2025-10-28T00:29:57.357723005Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.103µs"
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.357832918Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.357848707Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.357948289Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.357957486Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.357973034Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358006992Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358014274Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358164598Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358174072Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358180526Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358184924Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358393 containerd[1625]: time="2025-10-28T00:29:57.358229766Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358559 containerd[1625]: time="2025-10-28T00:29:57.358353789Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358559 containerd[1625]: time="2025-10-28T00:29:57.358369991Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Oct 28 00:29:57.358559 containerd[1625]: time="2025-10-28T00:29:57.358376215Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Oct 28 00:29:57.358644 containerd[1625]: time="2025-10-28T00:29:57.358634556Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Oct 28 00:29:57.358852 containerd[1625]: time="2025-10-28T00:29:57.358842626Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Oct 28 00:29:57.358912 containerd[1625]: time="2025-10-28T00:29:57.358904451Z" level=info msg="metadata content store policy set" policy=shared
Oct 28 00:29:57.362917 containerd[1625]: time="2025-10-28T00:29:57.362884631Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.362967592Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.362986289Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.362999684Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363007890Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363034285Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363045184Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363054737Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363060874Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363066343Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363071192Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363079067Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363139344Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363151362Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Oct 28 00:29:57.363194 containerd[1625]: time="2025-10-28T00:29:57.363162861Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Oct 28 00:29:57.363397 containerd[1625]: time="2025-10-28T00:29:57.363169005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Oct 28 00:29:57.363397 containerd[1625]: time="2025-10-28T00:29:57.363175474Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Oct 28 00:29:57.363428 containerd[1625]: time="2025-10-28T00:29:57.363181266Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Oct 28 00:29:57.363463 containerd[1625]: time="2025-10-28T00:29:57.363454672Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Oct 28 00:29:57.363586 containerd[1625]: time="2025-10-28T00:29:57.363499072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Oct 28 00:29:57.363586 containerd[1625]: time="2025-10-28T00:29:57.363509506Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Oct 28 00:29:57.363586 containerd[1625]: time="2025-10-28T00:29:57.363515782Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Oct 28 00:29:57.363586 containerd[1625]: time="2025-10-28T00:29:57.363521654Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Oct 28 00:29:57.363586 containerd[1625]: time="2025-10-28T00:29:57.363566421Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Oct 28 00:29:57.367584 containerd[1625]: time="2025-10-28T00:29:57.365597830Z" level=info msg="Start snapshots syncer"
Oct 28 00:29:57.367584 containerd[1625]: time="2025-10-28T00:29:57.365629882Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Oct 28 00:29:57.367584 containerd[1625]: time="2025-10-28T00:29:57.365826255Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.365859476Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366724340Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366829336Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366843939Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366851364Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366863447Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366871185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366877518Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366883573Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366898047Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366904549Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366912377Z" level=info msg="loading plugin" 
id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366937463Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 28 00:29:57.367719 containerd[1625]: time="2025-10-28T00:29:57.366951339Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.366957023Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.366962506Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.366966962Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.366972310Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.366977976Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367015518Z" level=info msg="runtime interface created" Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367020242Z" level=info msg="created NRI interface" Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367025543Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367032493Z" level=info msg="Connect containerd service" Oct 28 
00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367048159Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Oct 28 00:29:57.367895 containerd[1625]: time="2025-10-28T00:29:57.367520289Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 28 00:29:57.393111 sshd_keygen[1613]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Oct 28 00:29:57.420936 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Oct 28 00:29:57.423315 systemd[1]: Starting issuegen.service - Generate /run/issue... Oct 28 00:29:57.434503 systemd[1]: issuegen.service: Deactivated successfully. Oct 28 00:29:57.435614 systemd[1]: Finished issuegen.service - Generate /run/issue. Oct 28 00:29:57.441091 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Oct 28 00:29:57.489438 tar[1610]: linux-amd64/README.md Oct 28 00:29:57.496727 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Oct 28 00:29:57.497946 systemd[1]: Started getty@tty1.service - Getty on tty1. Oct 28 00:29:57.502519 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Oct 28 00:29:57.502745 systemd[1]: Reached target getty.target - Login Prompts. Oct 28 00:29:57.504246 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Oct 28 00:29:57.533301 containerd[1625]: time="2025-10-28T00:29:57.533278186Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Oct 28 00:29:57.533835 containerd[1625]: time="2025-10-28T00:29:57.533494132Z" level=info msg="Start subscribing containerd event" Oct 28 00:29:57.533955 containerd[1625]: time="2025-10-28T00:29:57.533929539Z" level=info msg="Start recovering state" Oct 28 00:29:57.534000 containerd[1625]: time="2025-10-28T00:29:57.533988919Z" level=info msg="Start event monitor" Oct 28 00:29:57.534017 containerd[1625]: time="2025-10-28T00:29:57.534001001Z" level=info msg="Start cni network conf syncer for default" Oct 28 00:29:57.534017 containerd[1625]: time="2025-10-28T00:29:57.534006493Z" level=info msg="Start streaming server" Oct 28 00:29:57.534017 containerd[1625]: time="2025-10-28T00:29:57.534012060Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Oct 28 00:29:57.534069 containerd[1625]: time="2025-10-28T00:29:57.534016040Z" level=info msg="runtime interface starting up..." Oct 28 00:29:57.534069 containerd[1625]: time="2025-10-28T00:29:57.534025077Z" level=info msg="starting plugins..." Oct 28 00:29:57.534069 containerd[1625]: time="2025-10-28T00:29:57.534035059Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Oct 28 00:29:57.534106 containerd[1625]: time="2025-10-28T00:29:57.533908764Z" level=info msg=serving... address=/run/containerd/containerd.sock Oct 28 00:29:57.534190 systemd[1]: Started containerd.service - containerd container runtime. Oct 28 00:29:57.535635 containerd[1625]: time="2025-10-28T00:29:57.535622827Z" level=info msg="containerd successfully booted in 0.195289s" Oct 28 00:29:57.542714 systemd-networkd[1294]: ens192: Gained IPv6LL Oct 28 00:29:57.544198 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Oct 28 00:29:57.544739 systemd[1]: Reached target network-online.target - Network is Online. Oct 28 00:29:57.545764 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... 
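[Editor's note] The earlier containerd error in this boot ("failed to load cni during init ... no network config found in /etc/cni/net.d") is expected on a node where no CNI network has been installed yet; the CRI plugin's conf syncer (started above) picks the config up once a file appears. For illustration only — the network name and subnet below are made up, not recovered from this host — a minimal bridge conflist that would satisfy the loader has this shape:

```json
{
  "cniVersion": "1.0.0",
  "name": "containerd-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[{ "subnet": "10.88.0.0/16" }]],
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}
```

Dropped into /etc/cni/net.d/ (e.g. as 10-containerd-net.conflist, a hypothetical filename), this would clear the "cni plugin not initialized" state on the next sync.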
Oct 28 00:29:57.552637 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:29:57.553730 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Oct 28 00:29:57.597445 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Oct 28 00:29:57.625659 systemd[1]: coreos-metadata.service: Deactivated successfully. Oct 28 00:29:57.625823 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Oct 28 00:29:57.626266 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Oct 28 00:29:58.938474 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:29:58.939082 systemd[1]: Reached target multi-user.target - Multi-User System. Oct 28 00:29:58.939743 systemd[1]: Startup finished in 2.591s (kernel) + 5.204s (initrd) + 4.605s (userspace) = 12.401s. Oct 28 00:29:58.945921 (kubelet)[1784]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:29:58.972660 login[1747]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Oct 28 00:29:58.972811 login[1746]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 28 00:29:58.978028 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 28 00:29:58.979324 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 28 00:29:58.987377 systemd-logind[1597]: New session 2 of user core. Oct 28 00:29:59.001542 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Oct 28 00:29:59.003660 systemd[1]: Starting user@500.service - User Manager for UID 500... Oct 28 00:29:59.022538 (systemd)[1791]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 28 00:29:59.024164 systemd-logind[1597]: New session c1 of user core. 
Oct 28 00:29:59.125026 systemd[1791]: Queued start job for default target default.target. Oct 28 00:29:59.141541 systemd[1791]: Created slice app.slice - User Application Slice. Oct 28 00:29:59.141561 systemd[1791]: Reached target paths.target - Paths. Oct 28 00:29:59.141608 systemd[1791]: Reached target timers.target - Timers. Oct 28 00:29:59.142344 systemd[1791]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 28 00:29:59.149477 systemd[1791]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 28 00:29:59.149518 systemd[1791]: Reached target sockets.target - Sockets. Oct 28 00:29:59.149546 systemd[1791]: Reached target basic.target - Basic System. Oct 28 00:29:59.149568 systemd[1791]: Reached target default.target - Main User Target. Oct 28 00:29:59.149602 systemd[1791]: Startup finished in 121ms. Oct 28 00:29:59.149751 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 28 00:29:59.156673 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 28 00:29:59.770811 kubelet[1784]: E1028 00:29:59.770776 1784 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:29:59.772408 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:29:59.772559 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:29:59.772886 systemd[1]: kubelet.service: Consumed 641ms CPU time, 268.9M memory peak. Oct 28 00:29:59.974161 login[1747]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 28 00:29:59.977605 systemd-logind[1597]: New session 1 of user core. Oct 28 00:29:59.983663 systemd[1]: Started session-1.scope - Session 1 of User core. 
Oct 28 00:30:10.022866 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Oct 28 00:30:10.023899 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:30:10.379521 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:30:10.391807 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:30:10.451060 kubelet[1833]: E1028 00:30:10.451016 1833 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:30:10.454003 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:30:10.454160 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:30:10.454452 systemd[1]: kubelet.service: Consumed 112ms CPU time, 110.7M memory peak. Oct 28 00:30:20.704598 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Oct 28 00:30:20.705840 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:30:21.064704 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
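[Editor's note] Each kubelet start in this log exits with "open /var/lib/kubelet/config.yaml: no such file or directory", and systemd keeps scheduling restarts. That is the normal state of a node before kubeadm init/join has written that file; once kubeadm runs, it generates a KubeletConfiguration there. As an illustrative sketch (not this host's actual config), the file it writes has this shape:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd            # matches SystemdCgroup=true in the containerd runc options above
staticPodPath: /etc/kubernetes/manifests
```

Until that file exists, the crash/restart cycle seen here (restart counter 1, 2, 3, ...) is expected and harmless.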
Oct 28 00:30:21.070909 (kubelet)[1847]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:30:21.110381 kubelet[1847]: E1028 00:30:21.110339 1847 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:30:21.112197 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:30:21.112369 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:30:21.112784 systemd[1]: kubelet.service: Consumed 105ms CPU time, 107.8M memory peak. Oct 28 00:30:27.295860 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 28 00:30:27.296978 systemd[1]: Started sshd@0-139.178.70.100:22-139.178.89.65:42548.service - OpenSSH per-connection server daemon (139.178.89.65:42548). Oct 28 00:30:27.410009 sshd[1854]: Accepted publickey for core from 139.178.89.65 port 42548 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:27.410960 sshd-session[1854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:27.414251 systemd-logind[1597]: New session 3 of user core. Oct 28 00:30:27.421722 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 28 00:30:27.473766 systemd[1]: Started sshd@1-139.178.70.100:22-139.178.89.65:42556.service - OpenSSH per-connection server daemon (139.178.89.65:42556). 
Oct 28 00:30:27.517816 sshd[1860]: Accepted publickey for core from 139.178.89.65 port 42556 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:27.518751 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:27.521502 systemd-logind[1597]: New session 4 of user core. Oct 28 00:30:27.530663 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 28 00:30:27.579626 sshd[1863]: Connection closed by 139.178.89.65 port 42556 Oct 28 00:30:27.580084 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Oct 28 00:30:27.590158 systemd[1]: sshd@1-139.178.70.100:22-139.178.89.65:42556.service: Deactivated successfully. Oct 28 00:30:27.591301 systemd[1]: session-4.scope: Deactivated successfully. Oct 28 00:30:27.591880 systemd-logind[1597]: Session 4 logged out. Waiting for processes to exit. Oct 28 00:30:27.593197 systemd[1]: Started sshd@2-139.178.70.100:22-139.178.89.65:42560.service - OpenSSH per-connection server daemon (139.178.89.65:42560). Oct 28 00:30:27.594797 systemd-logind[1597]: Removed session 4. Oct 28 00:30:27.634229 sshd[1869]: Accepted publickey for core from 139.178.89.65 port 42560 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:27.634971 sshd-session[1869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:27.637448 systemd-logind[1597]: New session 5 of user core. Oct 28 00:30:27.644666 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 28 00:30:27.690199 sshd[1872]: Connection closed by 139.178.89.65 port 42560 Oct 28 00:30:27.690540 sshd-session[1869]: pam_unix(sshd:session): session closed for user core Oct 28 00:30:27.702689 systemd[1]: sshd@2-139.178.70.100:22-139.178.89.65:42560.service: Deactivated successfully. Oct 28 00:30:27.703646 systemd[1]: session-5.scope: Deactivated successfully. Oct 28 00:30:27.704387 systemd-logind[1597]: Session 5 logged out. 
Waiting for processes to exit. Oct 28 00:30:27.705428 systemd[1]: Started sshd@3-139.178.70.100:22-139.178.89.65:42572.service - OpenSSH per-connection server daemon (139.178.89.65:42572). Oct 28 00:30:27.707194 systemd-logind[1597]: Removed session 5. Oct 28 00:30:27.752480 sshd[1878]: Accepted publickey for core from 139.178.89.65 port 42572 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:27.753440 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:27.756550 systemd-logind[1597]: New session 6 of user core. Oct 28 00:30:27.768897 systemd[1]: Started session-6.scope - Session 6 of User core. Oct 28 00:30:27.817400 sshd[1881]: Connection closed by 139.178.89.65 port 42572 Oct 28 00:30:27.818300 sshd-session[1878]: pam_unix(sshd:session): session closed for user core Oct 28 00:30:27.824224 systemd[1]: sshd@3-139.178.70.100:22-139.178.89.65:42572.service: Deactivated successfully. Oct 28 00:30:27.825483 systemd[1]: session-6.scope: Deactivated successfully. Oct 28 00:30:27.827719 systemd[1]: Started sshd@4-139.178.70.100:22-139.178.89.65:42582.service - OpenSSH per-connection server daemon (139.178.89.65:42582). Oct 28 00:30:27.828278 systemd-logind[1597]: Session 6 logged out. Waiting for processes to exit. Oct 28 00:30:27.829051 systemd-logind[1597]: Removed session 6. Oct 28 00:30:27.866017 sshd[1887]: Accepted publickey for core from 139.178.89.65 port 42582 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:27.866786 sshd-session[1887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:27.869471 systemd-logind[1597]: New session 7 of user core. Oct 28 00:30:27.878794 systemd[1]: Started session-7.scope - Session 7 of User core. 
Oct 28 00:30:28.008596 sudo[1891]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 28 00:30:28.008820 sudo[1891]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:30:28.021379 sudo[1891]: pam_unix(sudo:session): session closed for user root Oct 28 00:30:28.022538 sshd[1890]: Connection closed by 139.178.89.65 port 42582 Oct 28 00:30:28.023101 sshd-session[1887]: pam_unix(sshd:session): session closed for user core Oct 28 00:30:28.033998 systemd[1]: sshd@4-139.178.70.100:22-139.178.89.65:42582.service: Deactivated successfully. Oct 28 00:30:28.035362 systemd[1]: session-7.scope: Deactivated successfully. Oct 28 00:30:28.036090 systemd-logind[1597]: Session 7 logged out. Waiting for processes to exit. Oct 28 00:30:28.037906 systemd[1]: Started sshd@5-139.178.70.100:22-139.178.89.65:42588.service - OpenSSH per-connection server daemon (139.178.89.65:42588). Oct 28 00:30:28.039164 systemd-logind[1597]: Removed session 7. Oct 28 00:30:28.082799 sshd[1897]: Accepted publickey for core from 139.178.89.65 port 42588 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:28.083607 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:28.086471 systemd-logind[1597]: New session 8 of user core. Oct 28 00:30:28.096698 systemd[1]: Started session-8.scope - Session 8 of User core. 
Oct 28 00:30:28.147916 sudo[1902]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 28 00:30:28.148167 sudo[1902]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:30:28.155632 sudo[1902]: pam_unix(sudo:session): session closed for user root Oct 28 00:30:28.159049 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 28 00:30:28.159389 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:30:28.166462 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 28 00:30:28.189981 augenrules[1924]: No rules Oct 28 00:30:28.190672 systemd[1]: audit-rules.service: Deactivated successfully. Oct 28 00:30:28.190966 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 28 00:30:28.191954 sudo[1901]: pam_unix(sudo:session): session closed for user root Oct 28 00:30:28.192834 sshd[1900]: Connection closed by 139.178.89.65 port 42588 Oct 28 00:30:28.193799 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Oct 28 00:30:28.197868 systemd[1]: sshd@5-139.178.70.100:22-139.178.89.65:42588.service: Deactivated successfully. Oct 28 00:30:28.198982 systemd[1]: session-8.scope: Deactivated successfully. Oct 28 00:30:28.199601 systemd-logind[1597]: Session 8 logged out. Waiting for processes to exit. Oct 28 00:30:28.201119 systemd-logind[1597]: Removed session 8. Oct 28 00:30:28.202327 systemd[1]: Started sshd@6-139.178.70.100:22-139.178.89.65:42600.service - OpenSSH per-connection server daemon (139.178.89.65:42600). 
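[Editor's note] The augenrules "No rules" message follows directly from the two sudo commands above, which deleted /etc/audit/rules.d/80-selinux.rules and 99-default.rules before restarting audit-rules, so the compiled rule set is empty. For reference — a generic auditd rule in auditctl syntax, not content recovered from the deleted files — a rules.d fragment looks like:

```
# Watch /etc/passwd for writes and attribute changes; tag events with key "identity"
-w /etc/passwd -p wa -k identity
```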
Oct 28 00:30:28.243109 sshd[1933]: Accepted publickey for core from 139.178.89.65 port 42600 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:30:28.243991 sshd-session[1933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:30:28.247017 systemd-logind[1597]: New session 9 of user core. Oct 28 00:30:28.256723 systemd[1]: Started session-9.scope - Session 9 of User core. Oct 28 00:30:28.305314 sudo[1937]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 28 00:30:28.305705 sudo[1937]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:30:28.774102 systemd[1]: Starting docker.service - Docker Application Container Engine... Oct 28 00:30:28.782926 (dockerd)[1955]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 28 00:30:29.119912 dockerd[1955]: time="2025-10-28T00:30:29.119708700Z" level=info msg="Starting up" Oct 28 00:30:29.120505 dockerd[1955]: time="2025-10-28T00:30:29.120494821Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Oct 28 00:30:29.126187 dockerd[1955]: time="2025-10-28T00:30:29.126147261Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Oct 28 00:30:29.170441 dockerd[1955]: time="2025-10-28T00:30:29.170415176Z" level=info msg="Loading containers: start." Oct 28 00:30:29.179591 kernel: Initializing XFRM netlink socket Oct 28 00:30:29.409929 systemd-networkd[1294]: docker0: Link UP Oct 28 00:30:29.411842 dockerd[1955]: time="2025-10-28T00:30:29.411816043Z" level=info msg="Loading containers: done." Oct 28 00:30:29.420436 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2435095509-merged.mount: Deactivated successfully. 
Oct 28 00:30:29.425406 dockerd[1955]: time="2025-10-28T00:30:29.425368510Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 28 00:30:29.425456 dockerd[1955]: time="2025-10-28T00:30:29.425438716Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Oct 28 00:30:29.425514 dockerd[1955]: time="2025-10-28T00:30:29.425497387Z" level=info msg="Initializing buildkit" Oct 28 00:30:29.474734 dockerd[1955]: time="2025-10-28T00:30:29.474653620Z" level=info msg="Completed buildkit initialization" Oct 28 00:30:29.479847 dockerd[1955]: time="2025-10-28T00:30:29.479819729Z" level=info msg="Daemon has completed initialization" Oct 28 00:30:29.479997 dockerd[1955]: time="2025-10-28T00:30:29.479908815Z" level=info msg="API listen on /run/docker.sock" Oct 28 00:30:29.480131 systemd[1]: Started docker.service - Docker Application Container Engine. Oct 28 00:30:30.569959 containerd[1625]: time="2025-10-28T00:30:30.569930981Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Oct 28 00:30:31.289217 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Oct 28 00:30:31.291795 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:30:31.300872 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2889337035.mount: Deactivated successfully. Oct 28 00:30:31.576608 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Oct 28 00:30:31.585792 (kubelet)[2190]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:30:31.611141 kubelet[2190]: E1028 00:30:31.611104 2190 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:30:31.612726 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:30:31.612865 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:30:31.613235 systemd[1]: kubelet.service: Consumed 101ms CPU time, 108M memory peak. Oct 28 00:30:32.931379 containerd[1625]: time="2025-10-28T00:30:32.930891276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:30:32.932015 containerd[1625]: time="2025-10-28T00:30:32.932004639Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114893" Oct 28 00:30:32.932315 containerd[1625]: time="2025-10-28T00:30:32.932300238Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:30:32.934441 containerd[1625]: time="2025-10-28T00:30:32.934425855Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:30:32.935218 containerd[1625]: time="2025-10-28T00:30:32.934853979Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id 
\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 2.364896813s"
Oct 28 00:30:32.935466 containerd[1625]: time="2025-10-28T00:30:32.935452871Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\""
Oct 28 00:30:32.935838 containerd[1625]: time="2025-10-28T00:30:32.935824551Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\""
Oct 28 00:30:34.655469 containerd[1625]: time="2025-10-28T00:30:34.655417990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:34.666852 containerd[1625]: time="2025-10-28T00:30:34.666804693Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020844"
Oct 28 00:30:34.695268 containerd[1625]: time="2025-10-28T00:30:34.695202823Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:34.697884 containerd[1625]: time="2025-10-28T00:30:34.697841935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:34.698402 containerd[1625]: time="2025-10-28T00:30:34.698341118Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.762417251s"
Oct 28 00:30:34.698402 containerd[1625]: time="2025-10-28T00:30:34.698362442Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\""
Oct 28 00:30:34.698851 containerd[1625]: time="2025-10-28T00:30:34.698735506Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\""
Oct 28 00:30:36.344597 containerd[1625]: time="2025-10-28T00:30:36.344116552Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:36.345210 containerd[1625]: time="2025-10-28T00:30:36.345059283Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155568"
Oct 28 00:30:36.345562 containerd[1625]: time="2025-10-28T00:30:36.345546024Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:36.347123 containerd[1625]: time="2025-10-28T00:30:36.347107836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:36.347835 containerd[1625]: time="2025-10-28T00:30:36.347817332Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.648898638s"
Oct 28 00:30:36.347835 containerd[1625]: time="2025-10-28T00:30:36.347835014Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\""
Oct 28 00:30:36.348212 containerd[1625]: time="2025-10-28T00:30:36.348188410Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\""
Oct 28 00:30:37.786278 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount597796819.mount: Deactivated successfully.
Oct 28 00:30:38.258930 containerd[1625]: time="2025-10-28T00:30:38.258897393Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:38.269627 containerd[1625]: time="2025-10-28T00:30:38.269596149Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929469"
Oct 28 00:30:38.277653 containerd[1625]: time="2025-10-28T00:30:38.277629256Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:38.286957 containerd[1625]: time="2025-10-28T00:30:38.286934044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:38.287227 containerd[1625]: time="2025-10-28T00:30:38.287207404Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.938999036s"
Oct 28 00:30:38.287278 containerd[1625]: time="2025-10-28T00:30:38.287269441Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\""
Oct 28 00:30:38.287602 containerd[1625]: time="2025-10-28T00:30:38.287552129Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Oct 28 00:30:38.957587 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2281805033.mount: Deactivated successfully.
Oct 28 00:30:39.960745 containerd[1625]: time="2025-10-28T00:30:39.960203778Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:39.967281 containerd[1625]: time="2025-10-28T00:30:39.967257949Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238"
Oct 28 00:30:39.975117 containerd[1625]: time="2025-10-28T00:30:39.975101062Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:39.981141 containerd[1625]: time="2025-10-28T00:30:39.981128349Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:39.981645 containerd[1625]: time="2025-10-28T00:30:39.981627452Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.693960721s"
Oct 28 00:30:39.981678 containerd[1625]: time="2025-10-28T00:30:39.981646529Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Oct 28 00:30:39.982083 containerd[1625]: time="2025-10-28T00:30:39.982066705Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Oct 28 00:30:40.682743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1824270364.mount: Deactivated successfully.
Oct 28 00:30:40.685818 containerd[1625]: time="2025-10-28T00:30:40.685719647Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Oct 28 00:30:40.686053 containerd[1625]: time="2025-10-28T00:30:40.686041990Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Oct 28 00:30:40.687506 containerd[1625]: time="2025-10-28T00:30:40.687177224Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Oct 28 00:30:40.687772 containerd[1625]: time="2025-10-28T00:30:40.687760116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Oct 28 00:30:40.688564 containerd[1625]: time="2025-10-28T00:30:40.688544197Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 706.402638ms"
Oct 28 00:30:40.688643 containerd[1625]: time="2025-10-28T00:30:40.688634117Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Oct 28 00:30:40.689128 containerd[1625]: time="2025-10-28T00:30:40.689110732Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Oct 28 00:30:41.302605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2742410522.mount: Deactivated successfully.
Oct 28 00:30:41.679722 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Oct 28 00:30:41.681342 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 28 00:30:41.834093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 28 00:30:41.837252 (kubelet)[2369]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 28 00:30:41.993611 kubelet[2369]: E1028 00:30:41.992585 2369 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 28 00:30:41.993860 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 28 00:30:41.993944 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 28 00:30:41.994230 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108.8M memory peak.
Oct 28 00:30:42.552265 update_engine[1598]: I20251028 00:30:42.551955 1598 update_attempter.cc:509] Updating boot flags...
Oct 28 00:30:44.469606 containerd[1625]: time="2025-10-28T00:30:44.469360468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:44.470007 containerd[1625]: time="2025-10-28T00:30:44.469720266Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378433"
Oct 28 00:30:44.470868 containerd[1625]: time="2025-10-28T00:30:44.470835539Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:44.472913 containerd[1625]: time="2025-10-28T00:30:44.472871913Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 28 00:30:44.474369 containerd[1625]: time="2025-10-28T00:30:44.474347990Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 3.785155807s"
Oct 28 00:30:44.474432 containerd[1625]: time="2025-10-28T00:30:44.474374418Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
Oct 28 00:30:47.532495 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 28 00:30:47.532666 systemd[1]: kubelet.service: Consumed 100ms CPU time, 108.8M memory peak.
Oct 28 00:30:47.536678 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 28 00:30:47.556644 systemd[1]: Reload requested from client PID 2427 ('systemctl') (unit session-9.scope)...
Oct 28 00:30:47.556658 systemd[1]: Reloading...
Oct 28 00:30:47.627610 zram_generator::config[2470]: No configuration found.
Oct 28 00:30:47.703869 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Oct 28 00:30:47.771459 systemd[1]: Reloading finished in 214 ms.
Oct 28 00:30:47.808792 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Oct 28 00:30:47.808856 systemd[1]: kubelet.service: Failed with result 'signal'.
Oct 28 00:30:47.809060 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 28 00:30:47.809098 systemd[1]: kubelet.service: Consumed 46ms CPU time, 64.6M memory peak.
Oct 28 00:30:47.810263 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 28 00:30:48.175856 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 28 00:30:48.182877 (kubelet)[2538]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Oct 28 00:30:48.229598 kubelet[2538]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 28 00:30:48.229598 kubelet[2538]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Oct 28 00:30:48.229598 kubelet[2538]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 28 00:30:48.229598 kubelet[2538]: I1028 00:30:48.229441 2538 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 28 00:30:48.712976 kubelet[2538]: I1028 00:30:48.712948 2538 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Oct 28 00:30:48.712976 kubelet[2538]: I1028 00:30:48.712970 2538 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 28 00:30:48.713119 kubelet[2538]: I1028 00:30:48.713107 2538 server.go:956] "Client rotation is on, will bootstrap in background"
Oct 28 00:30:48.743201 kubelet[2538]: E1028 00:30:48.742722 2538 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.100:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Oct 28 00:30:48.743201 kubelet[2538]: I1028 00:30:48.742969 2538 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Oct 28 00:30:48.760800 kubelet[2538]: I1028 00:30:48.760782 2538 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 28 00:30:48.765754 kubelet[2538]: I1028 00:30:48.765729 2538 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Oct 28 00:30:48.767237 kubelet[2538]: I1028 00:30:48.767201 2538 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 28 00:30:48.769783 kubelet[2538]: I1028 00:30:48.767235 2538 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 28 00:30:48.769919 kubelet[2538]: I1028 00:30:48.769792 2538 topology_manager.go:138] "Creating topology manager with none policy"
Oct 28 00:30:48.769919 kubelet[2538]: I1028 00:30:48.769805 2538 container_manager_linux.go:303] "Creating device plugin manager"
Oct 28 00:30:48.769981 kubelet[2538]: I1028 00:30:48.769927 2538 state_mem.go:36] "Initialized new in-memory state store"
Oct 28 00:30:48.772980 kubelet[2538]: I1028 00:30:48.772729 2538 kubelet.go:480] "Attempting to sync node with API server"
Oct 28 00:30:48.772980 kubelet[2538]: I1028 00:30:48.772748 2538 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 28 00:30:48.772980 kubelet[2538]: I1028 00:30:48.772773 2538 kubelet.go:386] "Adding apiserver pod source"
Oct 28 00:30:48.772980 kubelet[2538]: I1028 00:30:48.772791 2538 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 28 00:30:48.778785 kubelet[2538]: E1028 00:30:48.778764 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Oct 28 00:30:48.781584 kubelet[2538]: E1028 00:30:48.781361 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Oct 28 00:30:48.781584 kubelet[2538]: I1028 00:30:48.781438 2538 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Oct 28 00:30:48.781901 kubelet[2538]: I1028 00:30:48.781893 2538 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Oct 28 00:30:48.782908 kubelet[2538]: W1028 00:30:48.782895 2538 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Oct 28 00:30:48.788042 kubelet[2538]: I1028 00:30:48.788031 2538 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Oct 28 00:30:48.788128 kubelet[2538]: I1028 00:30:48.788123 2538 server.go:1289] "Started kubelet"
Oct 28 00:30:48.789999 kubelet[2538]: I1028 00:30:48.789984 2538 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Oct 28 00:30:48.791073 kubelet[2538]: I1028 00:30:48.791045 2538 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 28 00:30:48.791375 kubelet[2538]: I1028 00:30:48.791263 2538 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 28 00:30:48.792627 kubelet[2538]: I1028 00:30:48.792618 2538 server.go:317] "Adding debug handlers to kubelet server"
Oct 28 00:30:48.797839 kubelet[2538]: I1028 00:30:48.797824 2538 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 28 00:30:48.800499 kubelet[2538]: E1028 00:30:48.796944 2538 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.100:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.100:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.187280477d469cff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-10-28 00:30:48.788098303 +0000 UTC m=+0.602897705,LastTimestamp:2025-10-28 00:30:48.788098303 +0000 UTC m=+0.602897705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Oct 28 00:30:48.803950 kubelet[2538]: I1028 00:30:48.803928 2538 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Oct 28 00:30:48.808366 kubelet[2538]: I1028 00:30:48.808351 2538 volume_manager.go:297] "Starting Kubelet Volume Manager"
Oct 28 00:30:48.808703 kubelet[2538]: E1028 00:30:48.808689 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Oct 28 00:30:48.809453 kubelet[2538]: I1028 00:30:48.809323 2538 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Oct 28 00:30:48.809453 kubelet[2538]: I1028 00:30:48.809378 2538 reconciler.go:26] "Reconciler: start to sync state"
Oct 28 00:30:48.810839 kubelet[2538]: E1028 00:30:48.810798 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Oct 28 00:30:48.810894 kubelet[2538]: E1028 00:30:48.810849 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="200ms"
Oct 28 00:30:48.813439 kubelet[2538]: I1028 00:30:48.813296 2538 factory.go:223] Registration of the systemd container factory successfully
Oct 28 00:30:48.813439 kubelet[2538]: I1028 00:30:48.813371 2538 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Oct 28 00:30:48.815981 kubelet[2538]: E1028 00:30:48.814876 2538 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Oct 28 00:30:48.815981 kubelet[2538]: I1028 00:30:48.815441 2538 factory.go:223] Registration of the containerd container factory successfully
Oct 28 00:30:48.826317 kubelet[2538]: I1028 00:30:48.826238 2538 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Oct 28 00:30:48.827134 kubelet[2538]: I1028 00:30:48.827125 2538 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Oct 28 00:30:48.827189 kubelet[2538]: I1028 00:30:48.827184 2538 status_manager.go:230] "Starting to sync pod status with apiserver"
Oct 28 00:30:48.827238 kubelet[2538]: I1028 00:30:48.827232 2538 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Oct 28 00:30:48.827274 kubelet[2538]: I1028 00:30:48.827270 2538 kubelet.go:2436] "Starting kubelet main sync loop"
Oct 28 00:30:48.827327 kubelet[2538]: E1028 00:30:48.827318 2538 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 28 00:30:48.831462 kubelet[2538]: E1028 00:30:48.831433 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Oct 28 00:30:48.844489 kubelet[2538]: I1028 00:30:48.844458 2538 cpu_manager.go:221] "Starting CPU manager" policy="none"
Oct 28 00:30:48.844489 kubelet[2538]: I1028 00:30:48.844474 2538 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Oct 28 00:30:48.844489 kubelet[2538]: I1028 00:30:48.844487 2538 state_mem.go:36] "Initialized new in-memory state store"
Oct 28 00:30:48.847474 kubelet[2538]: I1028 00:30:48.847452 2538 policy_none.go:49] "None policy: Start"
Oct 28 00:30:48.847474 kubelet[2538]: I1028 00:30:48.847475 2538 memory_manager.go:186] "Starting memorymanager" policy="None"
Oct 28 00:30:48.847559 kubelet[2538]: I1028 00:30:48.847486 2538 state_mem.go:35] "Initializing new in-memory state store"
Oct 28 00:30:48.851962 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Oct 28 00:30:48.864704 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Oct 28 00:30:48.866937 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Oct 28 00:30:48.877742 kubelet[2538]: E1028 00:30:48.877268 2538 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Oct 28 00:30:48.877742 kubelet[2538]: I1028 00:30:48.877396 2538 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 28 00:30:48.877742 kubelet[2538]: I1028 00:30:48.877403 2538 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 28 00:30:48.877742 kubelet[2538]: I1028 00:30:48.877629 2538 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 28 00:30:48.878358 kubelet[2538]: E1028 00:30:48.878349 2538 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Oct 28 00:30:48.878531 kubelet[2538]: E1028 00:30:48.878524 2538 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Oct 28 00:30:48.935642 systemd[1]: Created slice kubepods-burstable-pod839aa59a6e530d85d66830f0d7f10257.slice - libcontainer container kubepods-burstable-pod839aa59a6e530d85d66830f0d7f10257.slice.
Oct 28 00:30:48.943058 kubelet[2538]: E1028 00:30:48.943039 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Oct 28 00:30:48.944636 systemd[1]: Created slice kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice - libcontainer container kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice.
Oct 28 00:30:48.946319 kubelet[2538]: W1028 00:30:48.946304 2538 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice/cpuset.cpus.effective": open /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice/cpuset.cpus.effective: no such device
Oct 28 00:30:48.953428 kubelet[2538]: E1028 00:30:48.953406 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Oct 28 00:30:48.955229 systemd[1]: Created slice kubepods-burstable-podd13d96f639b65e57f439b4396b605564.slice - libcontainer container kubepods-burstable-podd13d96f639b65e57f439b4396b605564.slice.
Oct 28 00:30:48.956464 kubelet[2538]: E1028 00:30:48.956456 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Oct 28 00:30:48.979615 kubelet[2538]: I1028 00:30:48.978685 2538 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Oct 28 00:30:48.980000 kubelet[2538]: E1028 00:30:48.979944 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost"
Oct 28 00:30:49.010618 kubelet[2538]: I1028 00:30:49.010374 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost"
Oct 28 00:30:49.010618 kubelet[2538]: I1028 00:30:49.010416 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost"
Oct 28 00:30:49.010618 kubelet[2538]: I1028 00:30:49.010430 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost"
Oct 28 00:30:49.010618 kubelet[2538]: I1028 00:30:49.010440 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost"
Oct 28 00:30:49.010618 kubelet[2538]: I1028 00:30:49.010458 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost"
Oct 28 00:30:49.010782 kubelet[2538]: I1028 00:30:49.010470 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost"
Oct 28 00:30:49.010782 kubelet[2538]: I1028 00:30:49.010478 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost"
Oct 28 00:30:49.010782 kubelet[2538]: I1028 00:30:49.010489 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost"
Oct 28 00:30:49.010782 kubelet[2538]: I1028 00:30:49.010500 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d13d96f639b65e57f439b4396b605564-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d13d96f639b65e57f439b4396b605564\") " pod="kube-system/kube-scheduler-localhost"
Oct 28 00:30:49.011129 kubelet[2538]: E1028 00:30:49.011109 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="400ms"
Oct 28 00:30:49.181824 kubelet[2538]: I1028 00:30:49.181796 2538 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Oct 28 00:30:49.182079 kubelet[2538]: E1028 00:30:49.182040 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost"
Oct 28 00:30:49.244232 containerd[1625]: time="2025-10-28T00:30:49.244153355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:839aa59a6e530d85d66830f0d7f10257,Namespace:kube-system,Attempt:0,}"
Oct 28 00:30:49.259920 containerd[1625]: time="2025-10-28T00:30:49.259670794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:20c890a246d840d308022312da9174cb,Namespace:kube-system,Attempt:0,}"
Oct 28 00:30:49.272834 containerd[1625]: time="2025-10-28T00:30:49.272813848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d13d96f639b65e57f439b4396b605564,Namespace:kube-system,Attempt:0,}"
Oct 28 00:30:49.315887 containerd[1625]: time="2025-10-28T00:30:49.315855575Z" level=info msg="connecting to shim 146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0" address="unix:///run/containerd/s/b8c6a8d504d8b5120a02436758f757292f28a1ce72092268d1589e1a5b491e1f" namespace=k8s.io protocol=ttrpc version=3
Oct 28 00:30:49.326869 containerd[1625]: time="2025-10-28T00:30:49.326850406Z" level=info msg="connecting to shim e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625" address="unix:///run/containerd/s/6ba0c818d9e7a062b3427e0c19693f9b0a57b04baa66297a15bcb282b9da1f73" namespace=k8s.io protocol=ttrpc version=3
Oct 28 00:30:49.327313 containerd[1625]: time="2025-10-28T00:30:49.327292169Z" level=info msg="connecting to shim c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba" address="unix:///run/containerd/s/5bc93eb5e44690b5827fc2fd0cbdaaf59e945d8411744a0f39a50219c2f7254e" namespace=k8s.io protocol=ttrpc version=3
Oct 28 00:30:49.412173 kubelet[2538]: E1028 00:30:49.412137 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="800ms"
Oct 28 00:30:49.498379 systemd[1]: Started cri-containerd-c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba.scope - libcontainer container c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba.
Oct 28 00:30:49.499756 systemd[1]: Started cri-containerd-e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625.scope - libcontainer container e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625.
Oct 28 00:30:49.511810 systemd[1]: Started cri-containerd-146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0.scope - libcontainer container 146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0.
Oct 28 00:30:49.583890 kubelet[2538]: I1028 00:30:49.583858 2538 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:30:49.584213 kubelet[2538]: E1028 00:30:49.584189 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost" Oct 28 00:30:49.677544 containerd[1625]: time="2025-10-28T00:30:49.677457911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:839aa59a6e530d85d66830f0d7f10257,Namespace:kube-system,Attempt:0,} returns sandbox id \"146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0\"" Oct 28 00:30:49.678248 containerd[1625]: time="2025-10-28T00:30:49.678203449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:20c890a246d840d308022312da9174cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba\"" Oct 28 00:30:49.679193 containerd[1625]: time="2025-10-28T00:30:49.679174473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d13d96f639b65e57f439b4396b605564,Namespace:kube-system,Attempt:0,} returns sandbox id \"e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625\"" Oct 28 00:30:49.681424 containerd[1625]: time="2025-10-28T00:30:49.681357334Z" level=info msg="CreateContainer within sandbox \"146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Oct 28 00:30:49.683470 containerd[1625]: time="2025-10-28T00:30:49.683068776Z" level=info msg="CreateContainer within sandbox \"c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Oct 28 00:30:49.684714 containerd[1625]: time="2025-10-28T00:30:49.684695582Z" 
level=info msg="CreateContainer within sandbox \"e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Oct 28 00:30:49.691840 containerd[1625]: time="2025-10-28T00:30:49.691813868Z" level=info msg="Container dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:30:49.692035 containerd[1625]: time="2025-10-28T00:30:49.691823555Z" level=info msg="Container 3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:30:49.692155 containerd[1625]: time="2025-10-28T00:30:49.692140171Z" level=info msg="Container 1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:30:49.701547 containerd[1625]: time="2025-10-28T00:30:49.701454074Z" level=info msg="CreateContainer within sandbox \"146e51cf6440b8185bbf7cf08d48b746eeff1096f1c503228e5d0b107ce934a0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85\"" Oct 28 00:30:49.702595 containerd[1625]: time="2025-10-28T00:30:49.702534102Z" level=info msg="StartContainer for \"1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85\"" Oct 28 00:30:49.708970 containerd[1625]: time="2025-10-28T00:30:49.708917363Z" level=info msg="CreateContainer within sandbox \"e46ea38d7e385562703bf25b680fb33ef3111caf823054112d07b1f8ae404625\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b\"" Oct 28 00:30:49.709054 containerd[1625]: time="2025-10-28T00:30:49.709038545Z" level=info msg="connecting to shim 1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85" address="unix:///run/containerd/s/b8c6a8d504d8b5120a02436758f757292f28a1ce72092268d1589e1a5b491e1f" protocol=ttrpc version=3 
Oct 28 00:30:49.709773 containerd[1625]: time="2025-10-28T00:30:49.709437422Z" level=info msg="CreateContainer within sandbox \"c6ec62a93446cf1247a94978698e2966faf15c7fc9ee2293318da8b968d96eba\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65\"" Oct 28 00:30:49.709911 containerd[1625]: time="2025-10-28T00:30:49.709862517Z" level=info msg="StartContainer for \"dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65\"" Oct 28 00:30:49.709971 containerd[1625]: time="2025-10-28T00:30:49.709866875Z" level=info msg="StartContainer for \"3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b\"" Oct 28 00:30:49.710440 containerd[1625]: time="2025-10-28T00:30:49.710409649Z" level=info msg="connecting to shim dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65" address="unix:///run/containerd/s/5bc93eb5e44690b5827fc2fd0cbdaaf59e945d8411744a0f39a50219c2f7254e" protocol=ttrpc version=3 Oct 28 00:30:49.710709 containerd[1625]: time="2025-10-28T00:30:49.710697919Z" level=info msg="connecting to shim 3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b" address="unix:///run/containerd/s/6ba0c818d9e7a062b3427e0c19693f9b0a57b04baa66297a15bcb282b9da1f73" protocol=ttrpc version=3 Oct 28 00:30:49.724725 systemd[1]: Started cri-containerd-1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85.scope - libcontainer container 1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85. Oct 28 00:30:49.728119 systemd[1]: Started cri-containerd-3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b.scope - libcontainer container 3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b. 
Oct 28 00:30:49.740401 kubelet[2538]: E1028 00:30:49.740371 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 28 00:30:49.740729 systemd[1]: Started cri-containerd-dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65.scope - libcontainer container dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65. Oct 28 00:30:49.773332 kubelet[2538]: E1028 00:30:49.772896 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 28 00:30:49.780125 containerd[1625]: time="2025-10-28T00:30:49.780050692Z" level=info msg="StartContainer for \"1da1112613abbe99910744e6da8f788640e1cc5b5734a7f245989505fbda3a85\" returns successfully" Oct 28 00:30:49.797749 containerd[1625]: time="2025-10-28T00:30:49.797694975Z" level=info msg="StartContainer for \"dbcaba7bfabc4b016796061f9bab129bd249967782a25d6031dab724590b9b65\" returns successfully" Oct 28 00:30:49.804663 containerd[1625]: time="2025-10-28T00:30:49.804638931Z" level=info msg="StartContainer for \"3ee33d81a0b316e2f432d3c2dba4cfa4b142cce0fc30621cd552fcc3e938df3b\" returns successfully" Oct 28 00:30:49.850202 kubelet[2538]: E1028 00:30:49.849867 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:30:49.853164 kubelet[2538]: E1028 00:30:49.852931 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the 
cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:30:49.855512 kubelet[2538]: E1028 00:30:49.855332 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:30:50.020070 kubelet[2538]: E1028 00:30:50.020044 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 28 00:30:50.198212 kubelet[2538]: E1028 00:30:50.198184 2538 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 28 00:30:50.212803 kubelet[2538]: E1028 00:30:50.212777 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="1.6s" Oct 28 00:30:50.385471 kubelet[2538]: I1028 00:30:50.385449 2538 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:30:50.855682 kubelet[2538]: E1028 00:30:50.855664 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:30:50.856055 kubelet[2538]: E1028 00:30:50.856042 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 
00:30:51.499681 kubelet[2538]: I1028 00:30:51.499658 2538 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 28 00:30:51.499681 kubelet[2538]: E1028 00:30:51.499682 2538 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Oct 28 00:30:51.512446 kubelet[2538]: E1028 00:30:51.512424 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:51.612564 kubelet[2538]: E1028 00:30:51.612529 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:51.713216 kubelet[2538]: E1028 00:30:51.713182 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:51.814051 kubelet[2538]: E1028 00:30:51.813264 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:51.858192 kubelet[2538]: E1028 00:30:51.858135 2538 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:30:51.914180 kubelet[2538]: E1028 00:30:51.914147 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.014704 kubelet[2538]: E1028 00:30:52.014672 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.115850 kubelet[2538]: E1028 00:30:52.115774 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.216613 kubelet[2538]: E1028 00:30:52.216556 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.316961 kubelet[2538]: E1028 00:30:52.316919 2538 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.418417 kubelet[2538]: E1028 00:30:52.417895 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.518264 kubelet[2538]: E1028 00:30:52.518234 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.619280 kubelet[2538]: E1028 00:30:52.619244 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.719715 kubelet[2538]: E1028 00:30:52.719687 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.819887 kubelet[2538]: E1028 00:30:52.819852 2538 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:30:52.910872 kubelet[2538]: I1028 00:30:52.910818 2538 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:52.918100 kubelet[2538]: I1028 00:30:52.918078 2538 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:52.920694 kubelet[2538]: I1028 00:30:52.920568 2538 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:53.260425 systemd[1]: Reload requested from client PID 2815 ('systemctl') (unit session-9.scope)... Oct 28 00:30:53.260435 systemd[1]: Reloading... Oct 28 00:30:53.325591 zram_generator::config[2865]: No configuration found. Oct 28 00:30:53.402171 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:30:53.479137 systemd[1]: Reloading finished in 218 ms. 
Oct 28 00:30:53.506662 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:30:53.519862 systemd[1]: kubelet.service: Deactivated successfully. Oct 28 00:30:53.520118 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:30:53.520188 systemd[1]: kubelet.service: Consumed 684ms CPU time, 127M memory peak. Oct 28 00:30:53.521713 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:30:53.925664 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:30:53.938979 (kubelet)[2926]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 28 00:30:54.093241 kubelet[2926]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 28 00:30:54.093445 kubelet[2926]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 28 00:30:54.093471 kubelet[2926]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 28 00:30:54.093534 kubelet[2926]: I1028 00:30:54.093520 2926 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 28 00:30:54.096986 kubelet[2926]: I1028 00:30:54.096965 2926 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Oct 28 00:30:54.096986 kubelet[2926]: I1028 00:30:54.096981 2926 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 28 00:30:54.097120 kubelet[2926]: I1028 00:30:54.097108 2926 server.go:956] "Client rotation is on, will bootstrap in background" Oct 28 00:30:54.097816 kubelet[2926]: I1028 00:30:54.097805 2926 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Oct 28 00:30:54.099365 kubelet[2926]: I1028 00:30:54.099176 2926 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 28 00:30:54.103260 kubelet[2926]: I1028 00:30:54.103244 2926 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 28 00:30:54.104766 kubelet[2926]: I1028 00:30:54.104752 2926 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Oct 28 00:30:54.104877 kubelet[2926]: I1028 00:30:54.104858 2926 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 28 00:30:54.104962 kubelet[2926]: I1028 00:30:54.104875 2926 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 28 00:30:54.105016 kubelet[2926]: I1028 00:30:54.104965 2926 topology_manager.go:138] "Creating topology manager with none policy" Oct 28 00:30:54.105016 
kubelet[2926]: I1028 00:30:54.104971 2926 container_manager_linux.go:303] "Creating device plugin manager" Oct 28 00:30:54.111754 kubelet[2926]: I1028 00:30:54.111735 2926 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:30:54.111901 kubelet[2926]: I1028 00:30:54.111891 2926 kubelet.go:480] "Attempting to sync node with API server" Oct 28 00:30:54.111923 kubelet[2926]: I1028 00:30:54.111902 2926 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 28 00:30:54.111923 kubelet[2926]: I1028 00:30:54.111917 2926 kubelet.go:386] "Adding apiserver pod source" Oct 28 00:30:54.111956 kubelet[2926]: I1028 00:30:54.111928 2926 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 28 00:30:54.114474 kubelet[2926]: I1028 00:30:54.114382 2926 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 28 00:30:54.114740 kubelet[2926]: I1028 00:30:54.114682 2926 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 28 00:30:54.117585 kubelet[2926]: I1028 00:30:54.116303 2926 watchdog_linux.go:99] "Systemd watchdog is not enabled" Oct 28 00:30:54.117585 kubelet[2926]: I1028 00:30:54.116327 2926 server.go:1289] "Started kubelet" Oct 28 00:30:54.117585 kubelet[2926]: I1028 00:30:54.117180 2926 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 28 00:30:54.128076 kubelet[2926]: I1028 00:30:54.128048 2926 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 28 00:30:54.129510 kubelet[2926]: I1028 00:30:54.129032 2926 volume_manager.go:297] "Starting Kubelet Volume Manager" Oct 28 00:30:54.129510 kubelet[2926]: I1028 00:30:54.129151 2926 server.go:317] "Adding debug handlers to kubelet server" Oct 28 00:30:54.129510 kubelet[2926]: I1028 00:30:54.129318 2926 desired_state_of_world_populator.go:150] "Desired state populator 
starts to run" Oct 28 00:30:54.129510 kubelet[2926]: I1028 00:30:54.129378 2926 reconciler.go:26] "Reconciler: start to sync state" Oct 28 00:30:54.130325 kubelet[2926]: E1028 00:30:54.130311 2926 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 28 00:30:54.131127 kubelet[2926]: I1028 00:30:54.131010 2926 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 28 00:30:54.132304 kubelet[2926]: I1028 00:30:54.132286 2926 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 28 00:30:54.132373 kubelet[2926]: I1028 00:30:54.132366 2926 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 28 00:30:54.142772 kubelet[2926]: I1028 00:30:54.142738 2926 factory.go:223] Registration of the containerd container factory successfully Oct 28 00:30:54.142772 kubelet[2926]: I1028 00:30:54.142750 2926 factory.go:223] Registration of the systemd container factory successfully Oct 28 00:30:54.142875 kubelet[2926]: I1028 00:30:54.142793 2926 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 28 00:30:54.145862 kubelet[2926]: I1028 00:30:54.145839 2926 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Oct 28 00:30:54.146477 kubelet[2926]: I1028 00:30:54.146462 2926 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Oct 28 00:30:54.146477 kubelet[2926]: I1028 00:30:54.146474 2926 status_manager.go:230] "Starting to sync pod status with apiserver" Oct 28 00:30:54.146530 kubelet[2926]: I1028 00:30:54.146487 2926 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 28 00:30:54.146530 kubelet[2926]: I1028 00:30:54.146490 2926 kubelet.go:2436] "Starting kubelet main sync loop" Oct 28 00:30:54.146530 kubelet[2926]: E1028 00:30:54.146512 2926 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176779 2926 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176796 2926 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176808 2926 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176893 2926 state_mem.go:88] "Updated default CPUSet" cpuSet="" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176900 2926 state_mem.go:96] "Updated CPUSet assignments" assignments={} Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176909 2926 policy_none.go:49] "None policy: Start" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176915 2926 memory_manager.go:186] "Starting memorymanager" policy="None" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176921 2926 state_mem.go:35] "Initializing new in-memory state store" Oct 28 00:30:54.177042 kubelet[2926]: I1028 00:30:54.176985 2926 state_mem.go:75] "Updated machine memory state" Oct 28 00:30:54.180242 kubelet[2926]: E1028 00:30:54.180228 2926 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 28 00:30:54.180337 kubelet[2926]: I1028 
00:30:54.180326 2926 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 28 00:30:54.180362 kubelet[2926]: I1028 00:30:54.180336 2926 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 28 00:30:54.180815 kubelet[2926]: I1028 00:30:54.180803 2926 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 28 00:30:54.182781 kubelet[2926]: E1028 00:30:54.182619 2926 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Oct 28 00:30:54.247742 kubelet[2926]: I1028 00:30:54.247657 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:54.248377 kubelet[2926]: I1028 00:30:54.248315 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:54.249073 kubelet[2926]: I1028 00:30:54.248984 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.252704 kubelet[2926]: E1028 00:30:54.252687 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:54.253152 kubelet[2926]: E1028 00:30:54.253078 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:54.253152 kubelet[2926]: E1028 00:30:54.253114 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.283990 kubelet[2926]: I1028 00:30:54.283975 2926 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:30:54.287719 kubelet[2926]: I1028 00:30:54.287700 2926 kubelet_node_status.go:124] "Node was 
previously registered" node="localhost" Oct 28 00:30:54.287805 kubelet[2926]: I1028 00:30:54.287749 2926 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 28 00:30:54.331001 kubelet[2926]: I1028 00:30:54.330978 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d13d96f639b65e57f439b4396b605564-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d13d96f639b65e57f439b4396b605564\") " pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:54.331001 kubelet[2926]: I1028 00:30:54.331005 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:54.331107 kubelet[2926]: I1028 00:30:54.331017 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:54.331107 kubelet[2926]: I1028 00:30:54.331030 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/839aa59a6e530d85d66830f0d7f10257-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"839aa59a6e530d85d66830f0d7f10257\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:54.331107 kubelet[2926]: I1028 00:30:54.331043 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-ca-certs\") pod 
\"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.331107 kubelet[2926]: I1028 00:30:54.331062 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.331107 kubelet[2926]: I1028 00:30:54.331073 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.331208 kubelet[2926]: I1028 00:30:54.331089 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:54.331208 kubelet[2926]: I1028 00:30:54.331101 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:30:55.113064 kubelet[2926]: I1028 00:30:55.112940 2926 apiserver.go:52] "Watching apiserver" Oct 28 00:30:55.129869 kubelet[2926]: I1028 00:30:55.129843 2926 desired_state_of_world_populator.go:158] "Finished 
populating initial desired state of world" Oct 28 00:30:55.167212 kubelet[2926]: I1028 00:30:55.167009 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:55.167405 kubelet[2926]: I1028 00:30:55.167395 2926 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:55.175362 kubelet[2926]: E1028 00:30:55.175162 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Oct 28 00:30:55.175889 kubelet[2926]: E1028 00:30:55.175873 2926 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Oct 28 00:30:55.184129 kubelet[2926]: I1028 00:30:55.184089 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.184067793 podStartE2EDuration="3.184067793s" podCreationTimestamp="2025-10-28 00:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:30:55.184049645 +0000 UTC m=+1.122771212" watchObservedRunningTime="2025-10-28 00:30:55.184067793 +0000 UTC m=+1.122789350" Oct 28 00:30:55.187793 kubelet[2926]: I1028 00:30:55.187697 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.187687584 podStartE2EDuration="3.187687584s" podCreationTimestamp="2025-10-28 00:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:30:55.187409898 +0000 UTC m=+1.126131463" watchObservedRunningTime="2025-10-28 00:30:55.187687584 +0000 UTC m=+1.126409143" Oct 28 00:30:59.965729 kubelet[2926]: I1028 00:30:59.965616 2926 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=7.965588886 podStartE2EDuration="7.965588886s" podCreationTimestamp="2025-10-28 00:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:30:55.191229994 +0000 UTC m=+1.129951554" watchObservedRunningTime="2025-10-28 00:30:59.965588886 +0000 UTC m=+5.904310452" Oct 28 00:31:00.610341 kubelet[2926]: I1028 00:31:00.610303 2926 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Oct 28 00:31:00.610629 containerd[1625]: time="2025-10-28T00:31:00.610566432Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Oct 28 00:31:00.610938 kubelet[2926]: I1028 00:31:00.610769 2926 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Oct 28 00:31:01.451497 systemd[1]: Created slice kubepods-besteffort-pod23852d2c_1ef5_46f8_976e_a1a63bfa0fdc.slice - libcontainer container kubepods-besteffort-pod23852d2c_1ef5_46f8_976e_a1a63bfa0fdc.slice. 
Oct 28 00:31:01.478716 kubelet[2926]: I1028 00:31:01.478687 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/23852d2c-1ef5-46f8-976e-a1a63bfa0fdc-xtables-lock\") pod \"kube-proxy-pskvc\" (UID: \"23852d2c-1ef5-46f8-976e-a1a63bfa0fdc\") " pod="kube-system/kube-proxy-pskvc" Oct 28 00:31:01.478716 kubelet[2926]: I1028 00:31:01.478716 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/23852d2c-1ef5-46f8-976e-a1a63bfa0fdc-kube-proxy\") pod \"kube-proxy-pskvc\" (UID: \"23852d2c-1ef5-46f8-976e-a1a63bfa0fdc\") " pod="kube-system/kube-proxy-pskvc" Oct 28 00:31:01.478967 kubelet[2926]: I1028 00:31:01.478730 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23852d2c-1ef5-46f8-976e-a1a63bfa0fdc-lib-modules\") pod \"kube-proxy-pskvc\" (UID: \"23852d2c-1ef5-46f8-976e-a1a63bfa0fdc\") " pod="kube-system/kube-proxy-pskvc" Oct 28 00:31:01.478967 kubelet[2926]: I1028 00:31:01.478739 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd62\" (UniqueName: \"kubernetes.io/projected/23852d2c-1ef5-46f8-976e-a1a63bfa0fdc-kube-api-access-8rd62\") pod \"kube-proxy-pskvc\" (UID: \"23852d2c-1ef5-46f8-976e-a1a63bfa0fdc\") " pod="kube-system/kube-proxy-pskvc" Oct 28 00:31:01.767312 containerd[1625]: time="2025-10-28T00:31:01.766068467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pskvc,Uid:23852d2c-1ef5-46f8-976e-a1a63bfa0fdc,Namespace:kube-system,Attempt:0,}" Oct 28 00:31:01.831535 containerd[1625]: time="2025-10-28T00:31:01.831499062Z" level=info msg="connecting to shim 7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a" 
address="unix:///run/containerd/s/6662ae97cec3f352b4c8d35d3641886852932ffb73827ef18784bb347b1f3b0f" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:01.847925 systemd[1]: Created slice kubepods-besteffort-poda5e80a98_680c_469a_a3bd_f1c85e1d5700.slice - libcontainer container kubepods-besteffort-poda5e80a98_680c_469a_a3bd_f1c85e1d5700.slice. Oct 28 00:31:01.859731 systemd[1]: Started cri-containerd-7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a.scope - libcontainer container 7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a. Oct 28 00:31:01.880434 kubelet[2926]: I1028 00:31:01.880413 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a5e80a98-680c-469a-a3bd-f1c85e1d5700-var-lib-calico\") pod \"tigera-operator-7dcd859c48-mm6rb\" (UID: \"a5e80a98-680c-469a-a3bd-f1c85e1d5700\") " pod="tigera-operator/tigera-operator-7dcd859c48-mm6rb" Oct 28 00:31:01.880511 kubelet[2926]: I1028 00:31:01.880440 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvrqc\" (UniqueName: \"kubernetes.io/projected/a5e80a98-680c-469a-a3bd-f1c85e1d5700-kube-api-access-fvrqc\") pod \"tigera-operator-7dcd859c48-mm6rb\" (UID: \"a5e80a98-680c-469a-a3bd-f1c85e1d5700\") " pod="tigera-operator/tigera-operator-7dcd859c48-mm6rb" Oct 28 00:31:01.883627 containerd[1625]: time="2025-10-28T00:31:01.883607928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pskvc,Uid:23852d2c-1ef5-46f8-976e-a1a63bfa0fdc,Namespace:kube-system,Attempt:0,} returns sandbox id \"7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a\"" Oct 28 00:31:01.886058 containerd[1625]: time="2025-10-28T00:31:01.886038534Z" level=info msg="CreateContainer within sandbox \"7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" 
Oct 28 00:31:01.892460 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3140687655.mount: Deactivated successfully. Oct 28 00:31:01.892550 containerd[1625]: time="2025-10-28T00:31:01.892473411Z" level=info msg="Container 2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:01.896381 containerd[1625]: time="2025-10-28T00:31:01.896363352Z" level=info msg="CreateContainer within sandbox \"7cc69f3082bcc54654df8ab13a3ecc154c16bda6c61cb1dbb2990efcb364a47a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f\"" Oct 28 00:31:01.897557 containerd[1625]: time="2025-10-28T00:31:01.897545045Z" level=info msg="StartContainer for \"2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f\"" Oct 28 00:31:01.898579 containerd[1625]: time="2025-10-28T00:31:01.898502847Z" level=info msg="connecting to shim 2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f" address="unix:///run/containerd/s/6662ae97cec3f352b4c8d35d3641886852932ffb73827ef18784bb347b1f3b0f" protocol=ttrpc version=3 Oct 28 00:31:01.908658 systemd[1]: Started cri-containerd-2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f.scope - libcontainer container 2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f. 
Oct 28 00:31:01.929656 containerd[1625]: time="2025-10-28T00:31:01.929629943Z" level=info msg="StartContainer for \"2c2a54d4785da82604a3ccecfc54593354947ab9fb0ab2736202c01344f03d0f\" returns successfully" Oct 28 00:31:02.151477 containerd[1625]: time="2025-10-28T00:31:02.151395796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mm6rb,Uid:a5e80a98-680c-469a-a3bd-f1c85e1d5700,Namespace:tigera-operator,Attempt:0,}" Oct 28 00:31:02.167944 containerd[1625]: time="2025-10-28T00:31:02.167919114Z" level=info msg="connecting to shim 7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e" address="unix:///run/containerd/s/fa8576412c2415c7b33b7ea13bfdad292eada99c4b03fd132bf6f611f7d315a6" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:02.192667 systemd[1]: Started cri-containerd-7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e.scope - libcontainer container 7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e. Oct 28 00:31:02.195252 kubelet[2926]: I1028 00:31:02.195212 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pskvc" podStartSLOduration=1.195200368 podStartE2EDuration="1.195200368s" podCreationTimestamp="2025-10-28 00:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:31:02.195136862 +0000 UTC m=+8.133858428" watchObservedRunningTime="2025-10-28 00:31:02.195200368 +0000 UTC m=+8.133921928" Oct 28 00:31:02.230007 containerd[1625]: time="2025-10-28T00:31:02.229942143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mm6rb,Uid:a5e80a98-680c-469a-a3bd-f1c85e1d5700,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e\"" Oct 28 00:31:02.231125 containerd[1625]: time="2025-10-28T00:31:02.231076666Z" level=info msg="PullImage 
\"quay.io/tigera/operator:v1.38.7\"" Oct 28 00:31:03.842980 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1904126516.mount: Deactivated successfully. Oct 28 00:31:05.531669 containerd[1625]: time="2025-10-28T00:31:05.531627356Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:05.532293 containerd[1625]: time="2025-10-28T00:31:05.532187845Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Oct 28 00:31:05.532606 containerd[1625]: time="2025-10-28T00:31:05.532588864Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:05.534224 containerd[1625]: time="2025-10-28T00:31:05.534197275Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:05.534768 containerd[1625]: time="2025-10-28T00:31:05.534743403Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.303650692s" Oct 28 00:31:05.534768 containerd[1625]: time="2025-10-28T00:31:05.534765917Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Oct 28 00:31:05.543386 containerd[1625]: time="2025-10-28T00:31:05.543337716Z" level=info msg="CreateContainer within sandbox \"7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Oct 28 00:31:05.547423 containerd[1625]: time="2025-10-28T00:31:05.547399286Z" level=info msg="Container 62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:05.552584 containerd[1625]: time="2025-10-28T00:31:05.552531611Z" level=info msg="CreateContainer within sandbox \"7d14bc09345331d159c3aed8edcf3ba56f159e7f80be64a7de88290698d0408e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec\"" Oct 28 00:31:05.553045 containerd[1625]: time="2025-10-28T00:31:05.553032175Z" level=info msg="StartContainer for \"62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec\"" Oct 28 00:31:05.553528 containerd[1625]: time="2025-10-28T00:31:05.553513566Z" level=info msg="connecting to shim 62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec" address="unix:///run/containerd/s/fa8576412c2415c7b33b7ea13bfdad292eada99c4b03fd132bf6f611f7d315a6" protocol=ttrpc version=3 Oct 28 00:31:05.569661 systemd[1]: Started cri-containerd-62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec.scope - libcontainer container 62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec. 
Oct 28 00:31:05.588220 containerd[1625]: time="2025-10-28T00:31:05.588192081Z" level=info msg="StartContainer for \"62dcedea7fc28cea64b5111914d8e4f013f23f9921bf81d1a213e9b4095de3ec\" returns successfully" Oct 28 00:31:07.965260 kubelet[2926]: I1028 00:31:07.965221 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-mm6rb" podStartSLOduration=3.66051408 podStartE2EDuration="6.965210259s" podCreationTimestamp="2025-10-28 00:31:01 +0000 UTC" firstStartedPulling="2025-10-28 00:31:02.230671346 +0000 UTC m=+8.169392905" lastFinishedPulling="2025-10-28 00:31:05.535367523 +0000 UTC m=+11.474089084" observedRunningTime="2025-10-28 00:31:06.19360624 +0000 UTC m=+12.132327815" watchObservedRunningTime="2025-10-28 00:31:07.965210259 +0000 UTC m=+13.903931825" Oct 28 00:31:10.801264 sudo[1937]: pam_unix(sudo:session): session closed for user root Oct 28 00:31:10.802562 sshd[1936]: Connection closed by 139.178.89.65 port 42600 Oct 28 00:31:10.803802 sshd-session[1933]: pam_unix(sshd:session): session closed for user core Oct 28 00:31:10.805680 systemd[1]: sshd@6-139.178.70.100:22-139.178.89.65:42600.service: Deactivated successfully. Oct 28 00:31:10.807260 systemd[1]: session-9.scope: Deactivated successfully. Oct 28 00:31:10.808811 systemd[1]: session-9.scope: Consumed 3.982s CPU time, 152.8M memory peak. Oct 28 00:31:10.811631 systemd-logind[1597]: Session 9 logged out. Waiting for processes to exit. Oct 28 00:31:10.813944 systemd-logind[1597]: Removed session 9. Oct 28 00:31:14.647587 systemd[1]: Created slice kubepods-besteffort-pod619457ae_efe5_46ea_86f2_718fbefa5756.slice - libcontainer container kubepods-besteffort-pod619457ae_efe5_46ea_86f2_718fbefa5756.slice. 
Oct 28 00:31:14.669745 kubelet[2926]: I1028 00:31:14.669671 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/619457ae-efe5-46ea-86f2-718fbefa5756-tigera-ca-bundle\") pod \"calico-typha-548c6cd86d-7lt2h\" (UID: \"619457ae-efe5-46ea-86f2-718fbefa5756\") " pod="calico-system/calico-typha-548c6cd86d-7lt2h" Oct 28 00:31:14.669745 kubelet[2926]: I1028 00:31:14.669697 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/619457ae-efe5-46ea-86f2-718fbefa5756-typha-certs\") pod \"calico-typha-548c6cd86d-7lt2h\" (UID: \"619457ae-efe5-46ea-86f2-718fbefa5756\") " pod="calico-system/calico-typha-548c6cd86d-7lt2h" Oct 28 00:31:14.669745 kubelet[2926]: I1028 00:31:14.669709 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltbct\" (UniqueName: \"kubernetes.io/projected/619457ae-efe5-46ea-86f2-718fbefa5756-kube-api-access-ltbct\") pod \"calico-typha-548c6cd86d-7lt2h\" (UID: \"619457ae-efe5-46ea-86f2-718fbefa5756\") " pod="calico-system/calico-typha-548c6cd86d-7lt2h" Oct 28 00:31:14.864582 systemd[1]: Created slice kubepods-besteffort-pod1e4ccaaf_7f0f_488b_9ab6_2ccf4f77a170.slice - libcontainer container kubepods-besteffort-pod1e4ccaaf_7f0f_488b_9ab6_2ccf4f77a170.slice. 
Oct 28 00:31:14.871509 kubelet[2926]: I1028 00:31:14.871095 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-cni-bin-dir\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871509 kubelet[2926]: I1028 00:31:14.871120 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-tigera-ca-bundle\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871509 kubelet[2926]: I1028 00:31:14.871135 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-cni-log-dir\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871509 kubelet[2926]: I1028 00:31:14.871150 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-var-lib-calico\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871509 kubelet[2926]: I1028 00:31:14.871161 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-flexvol-driver-host\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871716 kubelet[2926]: I1028 00:31:14.871174 
2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-var-run-calico\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871716 kubelet[2926]: I1028 00:31:14.871188 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f2nk\" (UniqueName: \"kubernetes.io/projected/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-kube-api-access-5f2nk\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871716 kubelet[2926]: I1028 00:31:14.871200 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-cni-net-dir\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871716 kubelet[2926]: I1028 00:31:14.871208 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-lib-modules\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871716 kubelet[2926]: I1028 00:31:14.871221 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-xtables-lock\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871843 kubelet[2926]: I1028 00:31:14.871235 2926 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-node-certs\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.871843 kubelet[2926]: I1028 00:31:14.871250 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170-policysync\") pod \"calico-node-f4gjv\" (UID: \"1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170\") " pod="calico-system/calico-node-f4gjv" Oct 28 00:31:14.951553 containerd[1625]: time="2025-10-28T00:31:14.951515195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-548c6cd86d-7lt2h,Uid:619457ae-efe5-46ea-86f2-718fbefa5756,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:14.982005 kubelet[2926]: E1028 00:31:14.981896 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:14.982005 kubelet[2926]: W1028 00:31:14.981909 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:14.986422 kubelet[2926]: E1028 00:31:14.986375 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.030079 kubelet[2926]: E1028 00:31:15.029808 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:15.037067 containerd[1625]: time="2025-10-28T00:31:15.037015780Z" level=info msg="connecting to shim 8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1" address="unix:///run/containerd/s/a7e45583036832b0c436a816bf1a3b8f0dad0e23fdce986e8c4989f24af691e2" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:15.059802 systemd[1]: Started cri-containerd-8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1.scope - libcontainer container 8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1. Oct 28 00:31:15.061749 kubelet[2926]: E1028 00:31:15.061737 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.061885 kubelet[2926]: W1028 00:31:15.061773 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.061885 kubelet[2926]: E1028 00:31:15.061785 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.062088 kubelet[2926]: E1028 00:31:15.062056 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.062088 kubelet[2926]: W1028 00:31:15.062062 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.062088 kubelet[2926]: E1028 00:31:15.062067 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.062644 kubelet[2926]: E1028 00:31:15.062602 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.062644 kubelet[2926]: W1028 00:31:15.062609 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.062644 kubelet[2926]: E1028 00:31:15.062615 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.062877 kubelet[2926]: E1028 00:31:15.062841 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.062877 kubelet[2926]: W1028 00:31:15.062847 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.062877 kubelet[2926]: E1028 00:31:15.062853 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.063050 kubelet[2926]: E1028 00:31:15.063045 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.063126 kubelet[2926]: W1028 00:31:15.063093 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.063126 kubelet[2926]: E1028 00:31:15.063106 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.063437 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.071088 kubelet[2926]: W1028 00:31:15.063442 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.063453 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.063839 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.071088 kubelet[2926]: W1028 00:31:15.063844 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.063850 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.064601 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.071088 kubelet[2926]: W1028 00:31:15.064607 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.064613 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.071088 kubelet[2926]: E1028 00:31:15.064717 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077131 kubelet[2926]: W1028 00:31:15.064722 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.064727 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.064813 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077131 kubelet[2926]: W1028 00:31:15.064818 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.064823 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.064919 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077131 kubelet[2926]: W1028 00:31:15.064934 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.064940 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.077131 kubelet[2926]: E1028 00:31:15.065031 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077131 kubelet[2926]: W1028 00:31:15.065035 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065041 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065135 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077294 kubelet[2926]: W1028 00:31:15.065140 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065144 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065268 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077294 kubelet[2926]: W1028 00:31:15.065273 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065278 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065356 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.077294 kubelet[2926]: W1028 00:31:15.065361 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.077294 kubelet[2926]: E1028 00:31:15.065366 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.065457 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.081866 kubelet[2926]: W1028 00:31:15.065462 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.065466 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.065557 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.081866 kubelet[2926]: W1028 00:31:15.065562 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.065603 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.066901 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.081866 kubelet[2926]: W1028 00:31:15.066907 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.066913 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.081866 kubelet[2926]: E1028 00:31:15.067034 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082030 kubelet[2926]: W1028 00:31:15.067038 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082030 kubelet[2926]: E1028 00:31:15.067043 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082030 kubelet[2926]: E1028 00:31:15.067157 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082030 kubelet[2926]: W1028 00:31:15.067162 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082030 kubelet[2926]: E1028 00:31:15.067167 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082030 kubelet[2926]: E1028 00:31:15.072517 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082030 kubelet[2926]: W1028 00:31:15.072526 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082030 kubelet[2926]: E1028 00:31:15.072536 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082030 kubelet[2926]: I1028 00:31:15.072914 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd5600ff-882d-4fd0-9a0a-4d9435b64027-socket-dir\") pod \"csi-node-driver-p9vxm\" (UID: \"bd5600ff-882d-4fd0-9a0a-4d9435b64027\") " pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:15.082164 kubelet[2926]: E1028 00:31:15.073019 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082164 kubelet[2926]: W1028 00:31:15.073025 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082164 kubelet[2926]: E1028 00:31:15.073031 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082164 kubelet[2926]: I1028 00:31:15.073042 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdkr\" (UniqueName: \"kubernetes.io/projected/bd5600ff-882d-4fd0-9a0a-4d9435b64027-kube-api-access-ftdkr\") pod \"csi-node-driver-p9vxm\" (UID: \"bd5600ff-882d-4fd0-9a0a-4d9435b64027\") " pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:15.082164 kubelet[2926]: E1028 00:31:15.073124 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082164 kubelet[2926]: W1028 00:31:15.073129 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082164 kubelet[2926]: E1028 00:31:15.073134 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082164 kubelet[2926]: I1028 00:31:15.073144 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd5600ff-882d-4fd0-9a0a-4d9435b64027-kubelet-dir\") pod \"csi-node-driver-p9vxm\" (UID: \"bd5600ff-882d-4fd0-9a0a-4d9435b64027\") " pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:15.082164 kubelet[2926]: E1028 00:31:15.073224 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082299 kubelet[2926]: W1028 00:31:15.073229 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082299 kubelet[2926]: E1028 00:31:15.073233 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082299 kubelet[2926]: I1028 00:31:15.073241 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/bd5600ff-882d-4fd0-9a0a-4d9435b64027-varrun\") pod \"csi-node-driver-p9vxm\" (UID: \"bd5600ff-882d-4fd0-9a0a-4d9435b64027\") " pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:15.082299 kubelet[2926]: E1028 00:31:15.073323 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082299 kubelet[2926]: W1028 00:31:15.073328 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082299 kubelet[2926]: E1028 00:31:15.073333 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082299 kubelet[2926]: I1028 00:31:15.073342 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd5600ff-882d-4fd0-9a0a-4d9435b64027-registration-dir\") pod \"csi-node-driver-p9vxm\" (UID: \"bd5600ff-882d-4fd0-9a0a-4d9435b64027\") " pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:15.082299 kubelet[2926]: E1028 00:31:15.073424 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082417 kubelet[2926]: W1028 00:31:15.073429 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073634 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073718 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082417 kubelet[2926]: W1028 00:31:15.073722 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073727 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073827 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082417 kubelet[2926]: W1028 00:31:15.073837 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073842 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082417 kubelet[2926]: E1028 00:31:15.073917 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082417 kubelet[2926]: W1028 00:31:15.073922 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.073927 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074268 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082597 kubelet[2926]: W1028 00:31:15.074275 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074294 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074370 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082597 kubelet[2926]: W1028 00:31:15.074374 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074379 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074447 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082597 kubelet[2926]: W1028 00:31:15.074451 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082597 kubelet[2926]: E1028 00:31:15.074456 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074560 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082752 kubelet[2926]: W1028 00:31:15.074565 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074577 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074685 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082752 kubelet[2926]: W1028 00:31:15.074690 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074694 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074787 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.082752 kubelet[2926]: W1028 00:31:15.074791 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.082752 kubelet[2926]: E1028 00:31:15.074796 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.116413 containerd[1625]: time="2025-10-28T00:31:15.116385115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-548c6cd86d-7lt2h,Uid:619457ae-efe5-46ea-86f2-718fbefa5756,Namespace:calico-system,Attempt:0,} returns sandbox id \"8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1\"" Oct 28 00:31:15.118368 containerd[1625]: time="2025-10-28T00:31:15.117736410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Oct 28 00:31:15.167797 containerd[1625]: time="2025-10-28T00:31:15.167725835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-f4gjv,Uid:1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:15.175091 kubelet[2926]: E1028 00:31:15.175062 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.175091 kubelet[2926]: W1028 00:31:15.175074 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.175091 kubelet[2926]: E1028 00:31:15.175088 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.176603 kubelet[2926]: E1028 00:31:15.175736 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.176603 kubelet[2926]: W1028 00:31:15.175746 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.176603 kubelet[2926]: E1028 00:31:15.175898 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177234 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.177715 kubelet[2926]: W1028 00:31:15.177247 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177259 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177414 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.177715 kubelet[2926]: W1028 00:31:15.177419 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177424 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177587 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.177715 kubelet[2926]: W1028 00:31:15.177593 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.177715 kubelet[2926]: E1028 00:31:15.177606 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.177978 kubelet[2926]: E1028 00:31:15.177780 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.177978 kubelet[2926]: W1028 00:31:15.177785 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.177978 kubelet[2926]: E1028 00:31:15.177790 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.178408 kubelet[2926]: E1028 00:31:15.178112 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.178408 kubelet[2926]: W1028 00:31:15.178117 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.178408 kubelet[2926]: E1028 00:31:15.178123 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.178636 kubelet[2926]: E1028 00:31:15.178622 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.178636 kubelet[2926]: W1028 00:31:15.178632 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.178706 kubelet[2926]: E1028 00:31:15.178644 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.180015 kubelet[2926]: E1028 00:31:15.180001 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.180059 kubelet[2926]: W1028 00:31:15.180028 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.180059 kubelet[2926]: E1028 00:31:15.180037 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.180226 kubelet[2926]: E1028 00:31:15.180185 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.180226 kubelet[2926]: W1028 00:31:15.180192 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.180226 kubelet[2926]: E1028 00:31:15.180198 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.180596 kubelet[2926]: E1028 00:31:15.180567 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.180646 kubelet[2926]: W1028 00:31:15.180639 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.180667 kubelet[2926]: E1028 00:31:15.180647 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.181126 kubelet[2926]: E1028 00:31:15.181113 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.181126 kubelet[2926]: W1028 00:31:15.181123 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.181183 kubelet[2926]: E1028 00:31:15.181132 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.182364 kubelet[2926]: E1028 00:31:15.182344 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.182364 kubelet[2926]: W1028 00:31:15.182352 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.182364 kubelet[2926]: E1028 00:31:15.182359 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.182524 kubelet[2926]: E1028 00:31:15.182511 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.182524 kubelet[2926]: W1028 00:31:15.182518 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.183217 kubelet[2926]: E1028 00:31:15.182526 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.183217 kubelet[2926]: E1028 00:31:15.183003 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.183217 kubelet[2926]: W1028 00:31:15.183010 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.183217 kubelet[2926]: E1028 00:31:15.183019 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.183565 kubelet[2926]: E1028 00:31:15.183553 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.183565 kubelet[2926]: W1028 00:31:15.183564 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.183696 kubelet[2926]: E1028 00:31:15.183680 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.184630 kubelet[2926]: E1028 00:31:15.183974 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.184630 kubelet[2926]: W1028 00:31:15.183984 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.184630 kubelet[2926]: E1028 00:31:15.183992 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.184630 kubelet[2926]: E1028 00:31:15.184438 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.184630 kubelet[2926]: W1028 00:31:15.184444 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.184630 kubelet[2926]: E1028 00:31:15.184451 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.184856 kubelet[2926]: E1028 00:31:15.184844 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.184856 kubelet[2926]: W1028 00:31:15.184852 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.184909 kubelet[2926]: E1028 00:31:15.184859 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.185191 kubelet[2926]: E1028 00:31:15.185085 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.185191 kubelet[2926]: W1028 00:31:15.185094 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.185191 kubelet[2926]: E1028 00:31:15.185102 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.185974 containerd[1625]: time="2025-10-28T00:31:15.185868174Z" level=info msg="connecting to shim e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549" address="unix:///run/containerd/s/9c3487752fb07dd19f93bd2d9ff275fd350afb0bdbcb8f38dd4812c308a55666" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:15.187243 kubelet[2926]: E1028 00:31:15.187193 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.187243 kubelet[2926]: W1028 00:31:15.187203 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.187243 kubelet[2926]: E1028 00:31:15.187213 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.187613 kubelet[2926]: E1028 00:31:15.187408 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.187613 kubelet[2926]: W1028 00:31:15.187413 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.187613 kubelet[2926]: E1028 00:31:15.187419 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.188138 kubelet[2926]: E1028 00:31:15.187736 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.188138 kubelet[2926]: W1028 00:31:15.187961 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.188138 kubelet[2926]: E1028 00:31:15.187967 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.188647 kubelet[2926]: E1028 00:31:15.188635 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.188647 kubelet[2926]: W1028 00:31:15.188644 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.188719 kubelet[2926]: E1028 00:31:15.188652 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.189064 kubelet[2926]: E1028 00:31:15.189051 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.189064 kubelet[2926]: W1028 00:31:15.189059 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.189064 kubelet[2926]: E1028 00:31:15.189065 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:15.191776 kubelet[2926]: E1028 00:31:15.191682 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:15.191924 kubelet[2926]: W1028 00:31:15.191835 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:15.192264 kubelet[2926]: E1028 00:31:15.192117 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:15.213696 systemd[1]: Started cri-containerd-e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549.scope - libcontainer container e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549. Oct 28 00:31:15.236809 containerd[1625]: time="2025-10-28T00:31:15.236782204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-f4gjv,Uid:1e4ccaaf-7f0f-488b-9ab6-2ccf4f77a170,Namespace:calico-system,Attempt:0,} returns sandbox id \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\"" Oct 28 00:31:16.570185 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1816574991.mount: Deactivated successfully. 
Oct 28 00:31:17.147817 kubelet[2926]: E1028 00:31:17.147782 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:17.532301 containerd[1625]: time="2025-10-28T00:31:17.532186923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:17.533373 containerd[1625]: time="2025-10-28T00:31:17.532778903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Oct 28 00:31:17.534207 containerd[1625]: time="2025-10-28T00:31:17.533867992Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.415740937s" Oct 28 00:31:17.534207 containerd[1625]: time="2025-10-28T00:31:17.533887269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Oct 28 00:31:17.536114 containerd[1625]: time="2025-10-28T00:31:17.536100056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Oct 28 00:31:17.540669 containerd[1625]: time="2025-10-28T00:31:17.540623078Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:17.541291 containerd[1625]: time="2025-10-28T00:31:17.541244048Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:17.580516 containerd[1625]: time="2025-10-28T00:31:17.580495254Z" level=info msg="CreateContainer within sandbox \"8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Oct 28 00:31:17.585293 containerd[1625]: time="2025-10-28T00:31:17.585266277Z" level=info msg="Container 32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:17.590531 containerd[1625]: time="2025-10-28T00:31:17.590504476Z" level=info msg="CreateContainer within sandbox \"8609a8c1ce1f3ad65ba8f4e730cd6d98beaa2ad947f166b355de7c692c40c9c1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0\"" Oct 28 00:31:17.591913 containerd[1625]: time="2025-10-28T00:31:17.591889968Z" level=info msg="StartContainer for \"32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0\"" Oct 28 00:31:17.594061 containerd[1625]: time="2025-10-28T00:31:17.593938587Z" level=info msg="connecting to shim 32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0" address="unix:///run/containerd/s/a7e45583036832b0c436a816bf1a3b8f0dad0e23fdce986e8c4989f24af691e2" protocol=ttrpc version=3 Oct 28 00:31:17.643701 systemd[1]: Started cri-containerd-32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0.scope - libcontainer container 32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0. 
Oct 28 00:31:17.685086 containerd[1625]: time="2025-10-28T00:31:17.684239177Z" level=info msg="StartContainer for \"32a36e0b6e0e00f308c1718495a710fc51674f8aa3482688069160f98557b4b0\" returns successfully" Oct 28 00:31:18.252620 kubelet[2926]: I1028 00:31:18.251380 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-548c6cd86d-7lt2h" podStartSLOduration=1.8341831929999999 podStartE2EDuration="4.251369513s" podCreationTimestamp="2025-10-28 00:31:14 +0000 UTC" firstStartedPulling="2025-10-28 00:31:15.11733933 +0000 UTC m=+21.056060886" lastFinishedPulling="2025-10-28 00:31:17.534525651 +0000 UTC m=+23.473247206" observedRunningTime="2025-10-28 00:31:18.248561206 +0000 UTC m=+24.187282771" watchObservedRunningTime="2025-10-28 00:31:18.251369513 +0000 UTC m=+24.190091079" Oct 28 00:31:18.289931 kubelet[2926]: E1028 00:31:18.289905 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.289931 kubelet[2926]: W1028 00:31:18.289925 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.289931 kubelet[2926]: E1028 00:31:18.289943 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290047 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.290502 kubelet[2926]: W1028 00:31:18.290053 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290062 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290151 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.290502 kubelet[2926]: W1028 00:31:18.290156 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290161 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290267 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.290502 kubelet[2926]: W1028 00:31:18.290272 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290294 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.290502 kubelet[2926]: E1028 00:31:18.290384 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.290805 kubelet[2926]: W1028 00:31:18.290390 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.290805 kubelet[2926]: E1028 00:31:18.290396 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.290928 kubelet[2926]: E1028 00:31:18.290910 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.290928 kubelet[2926]: W1028 00:31:18.290921 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.290980 kubelet[2926]: E1028 00:31:18.290930 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.291029 kubelet[2926]: E1028 00:31:18.291018 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291029 kubelet[2926]: W1028 00:31:18.291026 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291073 kubelet[2926]: E1028 00:31:18.291031 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.291128 kubelet[2926]: E1028 00:31:18.291116 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291128 kubelet[2926]: W1028 00:31:18.291124 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291176 kubelet[2926]: E1028 00:31:18.291129 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.291221 kubelet[2926]: E1028 00:31:18.291208 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291221 kubelet[2926]: W1028 00:31:18.291218 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291281 kubelet[2926]: E1028 00:31:18.291224 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291293 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291590 kubelet[2926]: W1028 00:31:18.291298 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291302 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291371 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291590 kubelet[2926]: W1028 00:31:18.291377 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291384 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291465 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291590 kubelet[2926]: W1028 00:31:18.291469 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291590 kubelet[2926]: E1028 00:31:18.291475 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291605 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291813 kubelet[2926]: W1028 00:31:18.291610 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291614 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291688 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291813 kubelet[2926]: W1028 00:31:18.291694 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291699 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291776 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.291813 kubelet[2926]: W1028 00:31:18.291780 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.291813 kubelet[2926]: E1028 00:31:18.291786 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.301197 kubelet[2926]: E1028 00:31:18.301177 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.301197 kubelet[2926]: W1028 00:31:18.301191 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.301280 kubelet[2926]: E1028 00:31:18.301204 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.308621 kubelet[2926]: E1028 00:31:18.308602 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.308621 kubelet[2926]: W1028 00:31:18.308618 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.308726 kubelet[2926]: E1028 00:31:18.308632 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.308802 kubelet[2926]: E1028 00:31:18.308791 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.308802 kubelet[2926]: W1028 00:31:18.308799 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.308929 kubelet[2926]: E1028 00:31:18.308805 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.308988 kubelet[2926]: E1028 00:31:18.308977 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.309023 kubelet[2926]: W1028 00:31:18.309017 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.309082 kubelet[2926]: E1028 00:31:18.309076 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.309273 kubelet[2926]: E1028 00:31:18.309211 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.309273 kubelet[2926]: W1028 00:31:18.309218 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.309273 kubelet[2926]: E1028 00:31:18.309225 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.309368 kubelet[2926]: E1028 00:31:18.309363 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.309401 kubelet[2926]: W1028 00:31:18.309396 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.309433 kubelet[2926]: E1028 00:31:18.309428 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.354803 kubelet[2926]: E1028 00:31:18.354743 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.354803 kubelet[2926]: W1028 00:31:18.354759 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.354803 kubelet[2926]: E1028 00:31:18.354773 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.355145 kubelet[2926]: E1028 00:31:18.355099 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.355145 kubelet[2926]: W1028 00:31:18.355107 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.355145 kubelet[2926]: E1028 00:31:18.355114 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.355341 kubelet[2926]: E1028 00:31:18.355296 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.355341 kubelet[2926]: W1028 00:31:18.355304 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.355341 kubelet[2926]: E1028 00:31:18.355310 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.355522 kubelet[2926]: E1028 00:31:18.355516 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.355607 kubelet[2926]: W1028 00:31:18.355560 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.355607 kubelet[2926]: E1028 00:31:18.355569 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.355790 kubelet[2926]: E1028 00:31:18.355742 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.355790 kubelet[2926]: W1028 00:31:18.355749 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.355790 kubelet[2926]: E1028 00:31:18.355756 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.355940 kubelet[2926]: E1028 00:31:18.355935 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356022 kubelet[2926]: W1028 00:31:18.355969 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356022 kubelet[2926]: E1028 00:31:18.355979 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.356179 kubelet[2926]: E1028 00:31:18.356171 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356310 kubelet[2926]: W1028 00:31:18.356212 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356310 kubelet[2926]: E1028 00:31:18.356221 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.356396 kubelet[2926]: E1028 00:31:18.356386 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356423 kubelet[2926]: W1028 00:31:18.356396 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356423 kubelet[2926]: E1028 00:31:18.356405 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.356506 kubelet[2926]: E1028 00:31:18.356484 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356506 kubelet[2926]: W1028 00:31:18.356493 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356506 kubelet[2926]: E1028 00:31:18.356499 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.356614 kubelet[2926]: E1028 00:31:18.356606 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356614 kubelet[2926]: W1028 00:31:18.356614 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356660 kubelet[2926]: E1028 00:31:18.356619 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:18.356873 kubelet[2926]: E1028 00:31:18.356861 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356873 kubelet[2926]: W1028 00:31:18.356871 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.356960 kubelet[2926]: E1028 00:31:18.356878 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:18.356988 kubelet[2926]: E1028 00:31:18.356973 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:18.356988 kubelet[2926]: W1028 00:31:18.356980 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:18.357024 kubelet[2926]: E1028 00:31:18.356987 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.147591 kubelet[2926]: E1028 00:31:19.147544 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:19.217825 kubelet[2926]: I1028 00:31:19.217797 2926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 28 00:31:19.299112 kubelet[2926]: E1028 00:31:19.299084 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.299112 kubelet[2926]: W1028 00:31:19.299104 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.299409 kubelet[2926]: E1028 00:31:19.299120 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.299409 kubelet[2926]: E1028 00:31:19.299238 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.299409 kubelet[2926]: W1028 00:31:19.299244 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.299409 kubelet[2926]: E1028 00:31:19.299255 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.299409 kubelet[2926]: E1028 00:31:19.299356 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.299409 kubelet[2926]: W1028 00:31:19.299361 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.299409 kubelet[2926]: E1028 00:31:19.299368 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299466 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.311844 kubelet[2926]: W1028 00:31:19.299472 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299478 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299617 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.311844 kubelet[2926]: W1028 00:31:19.299623 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299629 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299744 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.311844 kubelet[2926]: W1028 00:31:19.299750 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299756 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.311844 kubelet[2926]: E1028 00:31:19.299872 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312065 kubelet[2926]: W1028 00:31:19.299885 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.299891 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.299999 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312065 kubelet[2926]: W1028 00:31:19.300006 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.300012 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.300123 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312065 kubelet[2926]: W1028 00:31:19.300128 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.300134 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.312065 kubelet[2926]: E1028 00:31:19.300240 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312065 kubelet[2926]: W1028 00:31:19.300246 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300251 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300363 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312267 kubelet[2926]: W1028 00:31:19.300369 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300374 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300485 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312267 kubelet[2926]: W1028 00:31:19.300490 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300496 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300619 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312267 kubelet[2926]: W1028 00:31:19.300625 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312267 kubelet[2926]: E1028 00:31:19.300635 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.312464 kubelet[2926]: E1028 00:31:19.300771 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312464 kubelet[2926]: W1028 00:31:19.300777 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312464 kubelet[2926]: E1028 00:31:19.300783 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.312464 kubelet[2926]: E1028 00:31:19.300889 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.312464 kubelet[2926]: W1028 00:31:19.300894 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.312464 kubelet[2926]: E1028 00:31:19.300900 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.317222 kubelet[2926]: E1028 00:31:19.317206 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317222 kubelet[2926]: W1028 00:31:19.317218 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.317290 kubelet[2926]: E1028 00:31:19.317226 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.317366 kubelet[2926]: E1028 00:31:19.317355 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317366 kubelet[2926]: W1028 00:31:19.317364 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.317422 kubelet[2926]: E1028 00:31:19.317371 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.317534 kubelet[2926]: E1028 00:31:19.317500 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317534 kubelet[2926]: W1028 00:31:19.317532 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.317611 kubelet[2926]: E1028 00:31:19.317540 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.317711 kubelet[2926]: E1028 00:31:19.317699 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317711 kubelet[2926]: W1028 00:31:19.317708 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.317768 kubelet[2926]: E1028 00:31:19.317715 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.317848 kubelet[2926]: E1028 00:31:19.317836 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317848 kubelet[2926]: W1028 00:31:19.317846 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.317906 kubelet[2926]: E1028 00:31:19.317854 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.317984 kubelet[2926]: E1028 00:31:19.317973 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.317984 kubelet[2926]: W1028 00:31:19.317982 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318037 kubelet[2926]: E1028 00:31:19.317988 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.318132 kubelet[2926]: E1028 00:31:19.318119 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.318132 kubelet[2926]: W1028 00:31:19.318128 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318183 kubelet[2926]: E1028 00:31:19.318135 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.318375 kubelet[2926]: E1028 00:31:19.318313 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.318375 kubelet[2926]: W1028 00:31:19.318322 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318375 kubelet[2926]: E1028 00:31:19.318330 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.318447 kubelet[2926]: E1028 00:31:19.318419 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.318447 kubelet[2926]: W1028 00:31:19.318424 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318447 kubelet[2926]: E1028 00:31:19.318430 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.318610 kubelet[2926]: E1028 00:31:19.318514 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.318610 kubelet[2926]: W1028 00:31:19.318522 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318610 kubelet[2926]: E1028 00:31:19.318528 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.318732 kubelet[2926]: E1028 00:31:19.318723 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.318784 kubelet[2926]: W1028 00:31:19.318777 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.318827 kubelet[2926]: E1028 00:31:19.318820 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.319108 kubelet[2926]: E1028 00:31:19.319037 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319108 kubelet[2926]: W1028 00:31:19.319046 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319108 kubelet[2926]: E1028 00:31:19.319053 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.319247 kubelet[2926]: E1028 00:31:19.319234 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319247 kubelet[2926]: W1028 00:31:19.319243 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319311 kubelet[2926]: E1028 00:31:19.319250 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.319341 kubelet[2926]: E1028 00:31:19.319334 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319341 kubelet[2926]: W1028 00:31:19.319339 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319387 kubelet[2926]: E1028 00:31:19.319345 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.319558 kubelet[2926]: E1028 00:31:19.319440 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319558 kubelet[2926]: W1028 00:31:19.319449 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319558 kubelet[2926]: E1028 00:31:19.319455 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.319558 kubelet[2926]: E1028 00:31:19.319551 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319558 kubelet[2926]: W1028 00:31:19.319557 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319756 kubelet[2926]: E1028 00:31:19.319563 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.319807 kubelet[2926]: E1028 00:31:19.319783 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319807 kubelet[2926]: W1028 00:31:19.319804 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319882 kubelet[2926]: E1028 00:31:19.319811 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:31:19.319930 kubelet[2926]: E1028 00:31:19.319919 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:31:19.319930 kubelet[2926]: W1028 00:31:19.319925 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:31:19.319985 kubelet[2926]: E1028 00:31:19.319931 2926 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:31:19.398210 containerd[1625]: time="2025-10-28T00:31:19.397788973Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:19.398210 containerd[1625]: time="2025-10-28T00:31:19.398154211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Oct 28 00:31:19.398425 containerd[1625]: time="2025-10-28T00:31:19.398359818Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:19.399529 containerd[1625]: time="2025-10-28T00:31:19.399511187Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.863323485s" Oct 28 00:31:19.399564 containerd[1625]: time="2025-10-28T00:31:19.399530356Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Oct 28 00:31:19.400674 containerd[1625]: time="2025-10-28T00:31:19.400480256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:19.407997 containerd[1625]: time="2025-10-28T00:31:19.407973299Z" level=info msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 28 00:31:19.411498 containerd[1625]: time="2025-10-28T00:31:19.411478882Z" level=info msg="Container ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:19.415457 containerd[1625]: time="2025-10-28T00:31:19.415438108Z" level=info msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\"" Oct 28 00:31:19.416767 containerd[1625]: time="2025-10-28T00:31:19.416053490Z" level=info msg="StartContainer for \"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\"" Oct 28 00:31:19.418109 containerd[1625]: time="2025-10-28T00:31:19.418032372Z" level=info msg="connecting to shim ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47" address="unix:///run/containerd/s/9c3487752fb07dd19f93bd2d9ff275fd350afb0bdbcb8f38dd4812c308a55666" protocol=ttrpc version=3 Oct 28 00:31:19.434715 systemd[1]: Started cri-containerd-ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47.scope - libcontainer container ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47. 
Oct 28 00:31:19.465908 containerd[1625]: time="2025-10-28T00:31:19.465872592Z" level=info msg="StartContainer for \"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\" returns successfully" Oct 28 00:31:19.476812 systemd[1]: cri-containerd-ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47.scope: Deactivated successfully. Oct 28 00:31:19.477035 systemd[1]: cri-containerd-ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47.scope: Consumed 17ms CPU time, 6.2M memory peak, 3.8M written to disk. Oct 28 00:31:19.488478 containerd[1625]: time="2025-10-28T00:31:19.488452681Z" level=info msg="received exit event container_id:\"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\" id:\"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\" pid:3634 exited_at:{seconds:1761611479 nanos:479350476}" Oct 28 00:31:19.502718 containerd[1625]: time="2025-10-28T00:31:19.502690382Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\" id:\"ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47\" pid:3634 exited_at:{seconds:1761611479 nanos:479350476}" Oct 28 00:31:19.520000 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ea7bd8a71296fe46d2a33c4bc0806b88eaaa71b1fcd1b505e8102039d537aa47-rootfs.mount: Deactivated successfully. 
Oct 28 00:31:20.232384 containerd[1625]: time="2025-10-28T00:31:20.232180840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Oct 28 00:31:21.146925 kubelet[2926]: E1028 00:31:21.146839 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:23.203050 kubelet[2926]: E1028 00:31:23.202913 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:24.019688 containerd[1625]: time="2025-10-28T00:31:24.019636065Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:24.020434 containerd[1625]: time="2025-10-28T00:31:24.020404553Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Oct 28 00:31:24.020797 containerd[1625]: time="2025-10-28T00:31:24.020776012Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:24.022453 containerd[1625]: time="2025-10-28T00:31:24.022427545Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:24.023084 containerd[1625]: time="2025-10-28T00:31:24.022730734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" 
with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.790525248s" Oct 28 00:31:24.023084 containerd[1625]: time="2025-10-28T00:31:24.022747913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Oct 28 00:31:24.025254 containerd[1625]: time="2025-10-28T00:31:24.025226305Z" level=info msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 28 00:31:24.033520 containerd[1625]: time="2025-10-28T00:31:24.032726384Z" level=info msg="Container 32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:24.035535 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2641330094.mount: Deactivated successfully. 
Oct 28 00:31:24.052637 containerd[1625]: time="2025-10-28T00:31:24.052604135Z" level=info msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\"" Oct 28 00:31:24.054042 containerd[1625]: time="2025-10-28T00:31:24.053223301Z" level=info msg="StartContainer for \"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\"" Oct 28 00:31:24.055539 containerd[1625]: time="2025-10-28T00:31:24.055508662Z" level=info msg="connecting to shim 32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11" address="unix:///run/containerd/s/9c3487752fb07dd19f93bd2d9ff275fd350afb0bdbcb8f38dd4812c308a55666" protocol=ttrpc version=3 Oct 28 00:31:24.087819 systemd[1]: Started cri-containerd-32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11.scope - libcontainer container 32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11. Oct 28 00:31:24.206839 containerd[1625]: time="2025-10-28T00:31:24.206813474Z" level=info msg="StartContainer for \"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\" returns successfully" Oct 28 00:31:25.147692 kubelet[2926]: E1028 00:31:25.147602 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:26.528768 systemd[1]: cri-containerd-32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11.scope: Deactivated successfully. Oct 28 00:31:26.528985 systemd[1]: cri-containerd-32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11.scope: Consumed 350ms CPU time, 162.3M memory peak, 4.4M read from disk, 171.3M written to disk. 
Oct 28 00:31:26.542673 containerd[1625]: time="2025-10-28T00:31:26.542623728Z" level=info msg="received exit event container_id:\"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\" id:\"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\" pid:3692 exited_at:{seconds:1761611486 nanos:529641688}" Oct 28 00:31:26.543796 containerd[1625]: time="2025-10-28T00:31:26.542794494Z" level=info msg="TaskExit event in podsandbox handler container_id:\"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\" id:\"32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11\" pid:3692 exited_at:{seconds:1761611486 nanos:529641688}" Oct 28 00:31:26.632784 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-32dc7580e6a6480363050592e838c192198215671d35ec6a9c05d0e07d121d11-rootfs.mount: Deactivated successfully. Oct 28 00:31:26.653385 kubelet[2926]: I1028 00:31:26.653350 2926 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Oct 28 00:31:26.840133 systemd[1]: Created slice kubepods-besteffort-podab512388_5b28_4703_8a4e_dbe34005b6c3.slice - libcontainer container kubepods-besteffort-podab512388_5b28_4703_8a4e_dbe34005b6c3.slice. 
Oct 28 00:31:26.869825 kubelet[2926]: I1028 00:31:26.869744 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-backend-key-pair\") pod \"whisker-78cfdf5758-2h92h\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " pod="calico-system/whisker-78cfdf5758-2h92h" Oct 28 00:31:26.869825 kubelet[2926]: I1028 00:31:26.869771 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t5zc\" (UniqueName: \"kubernetes.io/projected/ab512388-5b28-4703-8a4e-dbe34005b6c3-kube-api-access-5t5zc\") pod \"whisker-78cfdf5758-2h92h\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " pod="calico-system/whisker-78cfdf5758-2h92h" Oct 28 00:31:26.869825 kubelet[2926]: I1028 00:31:26.869788 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-ca-bundle\") pod \"whisker-78cfdf5758-2h92h\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " pod="calico-system/whisker-78cfdf5758-2h92h" Oct 28 00:31:26.902427 systemd[1]: Created slice kubepods-burstable-pod961d689d_6f4c_4b9c_bdaf_12f5eeac421a.slice - libcontainer container kubepods-burstable-pod961d689d_6f4c_4b9c_bdaf_12f5eeac421a.slice. Oct 28 00:31:26.915397 systemd[1]: Created slice kubepods-burstable-pod410c3647_ec73_43b3_abf8_8d8ce60191b1.slice - libcontainer container kubepods-burstable-pod410c3647_ec73_43b3_abf8_8d8ce60191b1.slice. Oct 28 00:31:26.921034 systemd[1]: Created slice kubepods-besteffort-poda3d67cc9_a4af_4a25_892e_b5ffc390b89f.slice - libcontainer container kubepods-besteffort-poda3d67cc9_a4af_4a25_892e_b5ffc390b89f.slice. 
Oct 28 00:31:26.936022 systemd[1]: Created slice kubepods-besteffort-pod2c04f8c1_bc0d_4ee4_8a41_3a74344e8ece.slice - libcontainer container kubepods-besteffort-pod2c04f8c1_bc0d_4ee4_8a41_3a74344e8ece.slice. Oct 28 00:31:26.944286 systemd[1]: Created slice kubepods-besteffort-pod96642e2b_a099_4486_a4d3_2cc6f34eac9f.slice - libcontainer container kubepods-besteffort-pod96642e2b_a099_4486_a4d3_2cc6f34eac9f.slice. Oct 28 00:31:26.953678 systemd[1]: Created slice kubepods-besteffort-pod5296fefe_d676_448c_ac61_6435527489ed.slice - libcontainer container kubepods-besteffort-pod5296fefe_d676_448c_ac61_6435527489ed.slice. Oct 28 00:31:26.965879 systemd[1]: Created slice kubepods-besteffort-pod491a933d_8deb_47d2_a1f5_45928f657a21.slice - libcontainer container kubepods-besteffort-pod491a933d_8deb_47d2_a1f5_45928f657a21.slice. Oct 28 00:31:26.970742 kubelet[2926]: I1028 00:31:26.970721 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3d67cc9-a4af-4a25-892e-b5ffc390b89f-goldmane-ca-bundle\") pod \"goldmane-666569f655-x6j9p\" (UID: \"a3d67cc9-a4af-4a25-892e-b5ffc390b89f\") " pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:26.970815 kubelet[2926]: I1028 00:31:26.970744 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece-calico-apiserver-certs\") pod \"calico-apiserver-64d5cc589f-kfc7q\" (UID: \"2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece\") " pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" Oct 28 00:31:26.970815 kubelet[2926]: I1028 00:31:26.970758 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j5lv\" (UniqueName: \"kubernetes.io/projected/5296fefe-d676-448c-ac61-6435527489ed-kube-api-access-9j5lv\") pod 
\"calico-apiserver-64fb65988f-xxdqr\" (UID: \"5296fefe-d676-448c-ac61-6435527489ed\") " pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" Oct 28 00:31:26.970815 kubelet[2926]: I1028 00:31:26.970812 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blqb\" (UniqueName: \"kubernetes.io/projected/a3d67cc9-a4af-4a25-892e-b5ffc390b89f-kube-api-access-6blqb\") pod \"goldmane-666569f655-x6j9p\" (UID: \"a3d67cc9-a4af-4a25-892e-b5ffc390b89f\") " pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:26.970900 kubelet[2926]: I1028 00:31:26.970824 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96642e2b-a099-4486-a4d3-2cc6f34eac9f-tigera-ca-bundle\") pod \"calico-kube-controllers-6f4bdd4695-jlv4b\" (UID: \"96642e2b-a099-4486-a4d3-2cc6f34eac9f\") " pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" Oct 28 00:31:26.970900 kubelet[2926]: I1028 00:31:26.970835 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjdbq\" (UniqueName: \"kubernetes.io/projected/96642e2b-a099-4486-a4d3-2cc6f34eac9f-kube-api-access-fjdbq\") pod \"calico-kube-controllers-6f4bdd4695-jlv4b\" (UID: \"96642e2b-a099-4486-a4d3-2cc6f34eac9f\") " pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" Oct 28 00:31:26.971240 kubelet[2926]: I1028 00:31:26.971223 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjp7s\" (UniqueName: \"kubernetes.io/projected/2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece-kube-api-access-xjp7s\") pod \"calico-apiserver-64d5cc589f-kfc7q\" (UID: \"2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece\") " pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" Oct 28 00:31:26.971279 kubelet[2926]: I1028 00:31:26.971250 2926 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c7wd\" (UniqueName: \"kubernetes.io/projected/410c3647-ec73-43b3-abf8-8d8ce60191b1-kube-api-access-8c7wd\") pod \"coredns-674b8bbfcf-5h5lj\" (UID: \"410c3647-ec73-43b3-abf8-8d8ce60191b1\") " pod="kube-system/coredns-674b8bbfcf-5h5lj" Oct 28 00:31:26.971279 kubelet[2926]: I1028 00:31:26.971263 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5296fefe-d676-448c-ac61-6435527489ed-calico-apiserver-certs\") pod \"calico-apiserver-64fb65988f-xxdqr\" (UID: \"5296fefe-d676-448c-ac61-6435527489ed\") " pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" Oct 28 00:31:26.971279 kubelet[2926]: I1028 00:31:26.971275 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8k9d\" (UniqueName: \"kubernetes.io/projected/961d689d-6f4c-4b9c-bdaf-12f5eeac421a-kube-api-access-x8k9d\") pod \"coredns-674b8bbfcf-zbpq4\" (UID: \"961d689d-6f4c-4b9c-bdaf-12f5eeac421a\") " pod="kube-system/coredns-674b8bbfcf-zbpq4" Oct 28 00:31:26.971362 kubelet[2926]: I1028 00:31:26.971284 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a3d67cc9-a4af-4a25-892e-b5ffc390b89f-goldmane-key-pair\") pod \"goldmane-666569f655-x6j9p\" (UID: \"a3d67cc9-a4af-4a25-892e-b5ffc390b89f\") " pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:26.971362 kubelet[2926]: I1028 00:31:26.971305 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/961d689d-6f4c-4b9c-bdaf-12f5eeac421a-config-volume\") pod \"coredns-674b8bbfcf-zbpq4\" (UID: \"961d689d-6f4c-4b9c-bdaf-12f5eeac421a\") " pod="kube-system/coredns-674b8bbfcf-zbpq4" Oct 28 00:31:26.971362 
kubelet[2926]: I1028 00:31:26.971319 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/410c3647-ec73-43b3-abf8-8d8ce60191b1-config-volume\") pod \"coredns-674b8bbfcf-5h5lj\" (UID: \"410c3647-ec73-43b3-abf8-8d8ce60191b1\") " pod="kube-system/coredns-674b8bbfcf-5h5lj" Oct 28 00:31:26.971362 kubelet[2926]: I1028 00:31:26.971334 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d67cc9-a4af-4a25-892e-b5ffc390b89f-config\") pod \"goldmane-666569f655-x6j9p\" (UID: \"a3d67cc9-a4af-4a25-892e-b5ffc390b89f\") " pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:27.072242 kubelet[2926]: I1028 00:31:27.072213 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/491a933d-8deb-47d2-a1f5-45928f657a21-calico-apiserver-certs\") pod \"calico-apiserver-64d5cc589f-jn72s\" (UID: \"491a933d-8deb-47d2-a1f5-45928f657a21\") " pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:27.072383 kubelet[2926]: I1028 00:31:27.072364 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmbdw\" (UniqueName: \"kubernetes.io/projected/491a933d-8deb-47d2-a1f5-45928f657a21-kube-api-access-tmbdw\") pod \"calico-apiserver-64d5cc589f-jn72s\" (UID: \"491a933d-8deb-47d2-a1f5-45928f657a21\") " pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:27.152507 containerd[1625]: time="2025-10-28T00:31:27.151926483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78cfdf5758-2h92h,Uid:ab512388-5b28-4703-8a4e-dbe34005b6c3,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:27.161698 systemd[1]: Created slice kubepods-besteffort-podbd5600ff_882d_4fd0_9a0a_4d9435b64027.slice - 
libcontainer container kubepods-besteffort-podbd5600ff_882d_4fd0_9a0a_4d9435b64027.slice. Oct 28 00:31:27.163598 containerd[1625]: time="2025-10-28T00:31:27.163549484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:27.207215 containerd[1625]: time="2025-10-28T00:31:27.207186352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zbpq4,Uid:961d689d-6f4c-4b9c-bdaf-12f5eeac421a,Namespace:kube-system,Attempt:0,}" Oct 28 00:31:27.221581 containerd[1625]: time="2025-10-28T00:31:27.221409375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5h5lj,Uid:410c3647-ec73-43b3-abf8-8d8ce60191b1,Namespace:kube-system,Attempt:0,}" Oct 28 00:31:27.222808 containerd[1625]: time="2025-10-28T00:31:27.222788476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-x6j9p,Uid:a3d67cc9-a4af-4a25-892e-b5ffc390b89f,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:27.240847 containerd[1625]: time="2025-10-28T00:31:27.240806079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-kfc7q,Uid:2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:27.248440 containerd[1625]: time="2025-10-28T00:31:27.248366820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f4bdd4695-jlv4b,Uid:96642e2b-a099-4486-a4d3-2cc6f34eac9f,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:27.258958 containerd[1625]: time="2025-10-28T00:31:27.258941432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64fb65988f-xxdqr,Uid:5296fefe-d676-448c-ac61-6435527489ed,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:27.268663 containerd[1625]: time="2025-10-28T00:31:27.268617537Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:28.258910 containerd[1625]: time="2025-10-28T00:31:28.258832816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Oct 28 00:31:28.325966 containerd[1625]: time="2025-10-28T00:31:28.325923859Z" level=error msg="Failed to destroy network for sandbox \"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.329474 systemd[1]: run-netns-cni\x2dc5179f50\x2dff71\x2d21ca\x2d5096\x2d30af83164f45.mount: Deactivated successfully. Oct 28 00:31:28.349893 containerd[1625]: time="2025-10-28T00:31:28.334397507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-kfc7q,Uid:2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.350034 containerd[1625]: time="2025-10-28T00:31:28.340148542Z" level=error msg="Failed to destroy network for sandbox \"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.352594 systemd[1]: run-netns-cni\x2df5821ce5\x2de3af\x2d3670\x2d9639\x2d340aade1fa94.mount: Deactivated successfully. 
Oct 28 00:31:28.357803 containerd[1625]: time="2025-10-28T00:31:28.357752359Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5h5lj,Uid:410c3647-ec73-43b3-abf8-8d8ce60191b1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.360068 containerd[1625]: time="2025-10-28T00:31:28.341353174Z" level=error msg="Failed to destroy network for sandbox \"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.363009 systemd[1]: run-netns-cni\x2d13801613\x2d841e\x2ded05\x2d7afe\x2d8f14ec4f3018.mount: Deactivated successfully. 
Oct 28 00:31:28.366512 containerd[1625]: time="2025-10-28T00:31:28.364122671Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-x6j9p,Uid:a3d67cc9-a4af-4a25-892e-b5ffc390b89f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.366512 containerd[1625]: time="2025-10-28T00:31:28.346663271Z" level=error msg="Failed to destroy network for sandbox \"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.366512 containerd[1625]: time="2025-10-28T00:31:28.366267447Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f4bdd4695-jlv4b,Uid:96642e2b-a099-4486-a4d3-2cc6f34eac9f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.365957 systemd[1]: run-netns-cni\x2ddcd371ee\x2d06f9\x2d42d6\x2df088\x2ddecaeddea7fe.mount: Deactivated successfully. 
Oct 28 00:31:28.368293 containerd[1625]: time="2025-10-28T00:31:28.367045903Z" level=error msg="Failed to destroy network for sandbox \"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368293 containerd[1625]: time="2025-10-28T00:31:28.367518175Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64fb65988f-xxdqr,Uid:5296fefe-d676-448c-ac61-6435527489ed,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368293 containerd[1625]: time="2025-10-28T00:31:28.367612939Z" level=error msg="Failed to destroy network for sandbox \"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368293 containerd[1625]: time="2025-10-28T00:31:28.367983403Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368726 
containerd[1625]: time="2025-10-28T00:31:28.368040427Z" level=error msg="Failed to destroy network for sandbox \"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368726 containerd[1625]: time="2025-10-28T00:31:28.368409798Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zbpq4,Uid:961d689d-6f4c-4b9c-bdaf-12f5eeac421a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.368726 containerd[1625]: time="2025-10-28T00:31:28.368461342Z" level=error msg="Failed to destroy network for sandbox \"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.369502 containerd[1625]: time="2025-10-28T00:31:28.368939043Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.369557 kubelet[2926]: E1028 00:31:28.369232 2926 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.369557 kubelet[2926]: E1028 00:31:28.369286 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:28.369557 kubelet[2926]: E1028 00:31:28.369307 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:28.370597 kubelet[2926]: E1028 00:31:28.369354 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01a5c5292b842229bcb3eba8d726fb38b9b8f6a25bf5f50b86c6f3321cad87db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:28.370597 kubelet[2926]: E1028 00:31:28.369927 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.370597 kubelet[2926]: E1028 00:31:28.369959 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" Oct 28 00:31:28.370901 kubelet[2926]: E1028 00:31:28.370062 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" Oct 28 00:31:28.370901 kubelet[2926]: E1028 00:31:28.370092 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6f4bdd4695-jlv4b_calico-system(96642e2b-a099-4486-a4d3-2cc6f34eac9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-6f4bdd4695-jlv4b_calico-system(96642e2b-a099-4486-a4d3-2cc6f34eac9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50d74ce33eb0867a6e750589b9e844401f9d9bd620826042025996255ed90f8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:31:28.370901 kubelet[2926]: E1028 00:31:28.370119 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371006 kubelet[2926]: E1028 00:31:28.370145 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" Oct 28 00:31:28.371006 kubelet[2926]: E1028 00:31:28.370155 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" Oct 28 
00:31:28.371006 kubelet[2926]: E1028 00:31:28.370171 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c522940ae900f52df63c1971cce9470509e8d1859164d1437da7f45a0d48b17\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:31:28.371124 kubelet[2926]: E1028 00:31:28.370197 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371124 kubelet[2926]: E1028 00:31:28.370222 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5h5lj" Oct 28 00:31:28.371124 kubelet[2926]: E1028 00:31:28.370232 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5h5lj" Oct 28 00:31:28.371216 kubelet[2926]: E1028 00:31:28.370247 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5h5lj_kube-system(410c3647-ec73-43b3-abf8-8d8ce60191b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5h5lj_kube-system(410c3647-ec73-43b3-abf8-8d8ce60191b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0fa911a671ae334afb5bdee448d7b09ccfe826c69dc3b172bc160f9e516a422d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5h5lj" podUID="410c3647-ec73-43b3-abf8-8d8ce60191b1" Oct 28 00:31:28.371216 kubelet[2926]: E1028 00:31:28.370262 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371216 kubelet[2926]: E1028 00:31:28.370271 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:28.371330 kubelet[2926]: E1028 00:31:28.370279 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-x6j9p" Oct 28 00:31:28.371330 kubelet[2926]: E1028 00:31:28.370386 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"44b7c3a5613b7244fe1dfb938ef0f9eac93fc6c9a4fd717bdd6cde73f9008d6b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:31:28.371330 kubelet[2926]: E1028 00:31:28.370405 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371417 kubelet[2926]: E1028 00:31:28.370415 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:28.371417 kubelet[2926]: E1028 00:31:28.370423 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:28.371417 kubelet[2926]: E1028 00:31:28.370451 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1371f5bdba8d8e45f28ec9dcbcfab796a7b0cb698be343b66697b79b479fe29\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:31:28.371497 kubelet[2926]: E1028 00:31:28.370474 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371497 kubelet[2926]: E1028 00:31:28.370485 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" Oct 28 00:31:28.371497 kubelet[2926]: E1028 00:31:28.370493 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" Oct 28 00:31:28.371552 kubelet[2926]: E1028 00:31:28.370509 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd6997c1af391478a345c1755e899d0bb38c2e32daab5f12bc199525848cf960\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:31:28.371552 kubelet[2926]: E1028 
00:31:28.370537 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.371552 kubelet[2926]: E1028 00:31:28.370548 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zbpq4" Oct 28 00:31:28.372021 kubelet[2926]: E1028 00:31:28.370554 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zbpq4" Oct 28 00:31:28.372021 kubelet[2926]: E1028 00:31:28.370568 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zbpq4_kube-system(961d689d-6f4c-4b9c-bdaf-12f5eeac421a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zbpq4_kube-system(961d689d-6f4c-4b9c-bdaf-12f5eeac421a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c61d0c7697bf9366826625d7de4fedb4a65a648c31f28ed44d837f66a288ee4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zbpq4" podUID="961d689d-6f4c-4b9c-bdaf-12f5eeac421a" Oct 28 00:31:28.372814 containerd[1625]: time="2025-10-28T00:31:28.372783076Z" level=error msg="Failed to destroy network for sandbox \"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.374735 containerd[1625]: time="2025-10-28T00:31:28.374678010Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78cfdf5758-2h92h,Uid:ab512388-5b28-4703-8a4e-dbe34005b6c3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.375393 kubelet[2926]: E1028 00:31:28.375315 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:28.375488 kubelet[2926]: E1028 00:31:28.375475 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78cfdf5758-2h92h" Oct 28 00:31:28.375543 kubelet[2926]: E1028 00:31:28.375535 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78cfdf5758-2h92h" Oct 28 00:31:28.375719 kubelet[2926]: E1028 00:31:28.375676 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-78cfdf5758-2h92h_calico-system(ab512388-5b28-4703-8a4e-dbe34005b6c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-78cfdf5758-2h92h_calico-system(ab512388-5b28-4703-8a4e-dbe34005b6c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cdb3a5042abf70bd5051de5ac9c2789abad8132681f1266a303ba9543d670e57\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-78cfdf5758-2h92h" podUID="ab512388-5b28-4703-8a4e-dbe34005b6c3" Oct 28 00:31:28.632927 systemd[1]: run-netns-cni\x2d8f4be3ab\x2d973b\x2d72cf\x2db2f2\x2dad4ee111b96c.mount: Deactivated successfully. Oct 28 00:31:28.633004 systemd[1]: run-netns-cni\x2d7acba666\x2d483a\x2d2b7d\x2d19b8\x2d208364a38576.mount: Deactivated successfully. Oct 28 00:31:28.633055 systemd[1]: run-netns-cni\x2daca80974\x2d86b0\x2d4adb\x2decfc\x2d38601befb44f.mount: Deactivated successfully. Oct 28 00:31:28.633097 systemd[1]: run-netns-cni\x2df28e76df\x2d16fa\x2d9c20\x2da490\x2dba8b76dfc179.mount: Deactivated successfully. 
Oct 28 00:31:28.633146 systemd[1]: run-netns-cni\x2d4ac73474\x2d14c9\x2dc5cc\x2d4fbd\x2d87ac23295ee2.mount: Deactivated successfully. Oct 28 00:31:35.256274 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2631238852.mount: Deactivated successfully. Oct 28 00:31:36.022730 containerd[1625]: time="2025-10-28T00:31:36.022602363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:36.514722 containerd[1625]: time="2025-10-28T00:31:36.514677837Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Oct 28 00:31:36.526381 containerd[1625]: time="2025-10-28T00:31:36.525379229Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:36.537551 containerd[1625]: time="2025-10-28T00:31:36.537513389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:31:36.540287 containerd[1625]: time="2025-10-28T00:31:36.540266178Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 8.279390139s" Oct 28 00:31:36.540390 containerd[1625]: time="2025-10-28T00:31:36.540378324Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Oct 28 00:31:36.956700 containerd[1625]: time="2025-10-28T00:31:36.956670822Z" level=info 
msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 28 00:31:37.217388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3296054790.mount: Deactivated successfully. Oct 28 00:31:37.217807 containerd[1625]: time="2025-10-28T00:31:37.217732333Z" level=info msg="Container 8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:37.389580 containerd[1625]: time="2025-10-28T00:31:37.389545806Z" level=info msg="CreateContainer within sandbox \"e6d9ed0e6cbc36b3af20ae730cf348a32f3ac779035e4248ac67ae0fe5bcc549\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\"" Oct 28 00:31:37.390424 containerd[1625]: time="2025-10-28T00:31:37.389945804Z" level=info msg="StartContainer for \"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\"" Oct 28 00:31:37.423775 containerd[1625]: time="2025-10-28T00:31:37.423739651Z" level=info msg="connecting to shim 8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f" address="unix:///run/containerd/s/9c3487752fb07dd19f93bd2d9ff275fd350afb0bdbcb8f38dd4812c308a55666" protocol=ttrpc version=3 Oct 28 00:31:37.778726 systemd[1]: Started cri-containerd-8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f.scope - libcontainer container 8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f. Oct 28 00:31:37.866671 containerd[1625]: time="2025-10-28T00:31:37.866645848Z" level=info msg="StartContainer for \"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" returns successfully" Oct 28 00:31:38.858624 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 28 00:31:38.916666 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Oct 28 00:31:38.935858 containerd[1625]: time="2025-10-28T00:31:38.935831016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" id:\"bcf8aed4c0612c907cebb3502e09285eeb21fd7228f7e657de99c7e6798b1bd8\" pid:4023 exit_status:1 exited_at:{seconds:1761611498 nanos:935448640}" Oct 28 00:31:39.147243 containerd[1625]: time="2025-10-28T00:31:39.147180080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:39.147455 containerd[1625]: time="2025-10-28T00:31:39.147361219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:39.255976 containerd[1625]: time="2025-10-28T00:31:39.255943052Z" level=error msg="Failed to destroy network for sandbox \"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.257320 systemd[1]: run-netns-cni\x2d0acfe9f2\x2df179\x2d96ac\x2d3594\x2da207fd2e2060.mount: Deactivated successfully. 
Oct 28 00:31:39.325589 containerd[1625]: time="2025-10-28T00:31:39.325531920Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.332707 containerd[1625]: time="2025-10-28T00:31:39.330506003Z" level=error msg="Failed to destroy network for sandbox \"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.331846 systemd[1]: run-netns-cni\x2dee68dec5\x2d56d9\x2d0506\x2dae79\x2d4ef7821d8bed.mount: Deactivated successfully. 
Oct 28 00:31:39.374786 kubelet[2926]: E1028 00:31:39.325812 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.374786 kubelet[2926]: E1028 00:31:39.325863 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:39.374786 kubelet[2926]: E1028 00:31:39.325877 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" Oct 28 00:31:39.375051 kubelet[2926]: E1028 00:31:39.325941 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"f5e5394c4cabd11126792ae337417b30587af62135fd834f5dc4603da66c0197\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:31:39.394013 containerd[1625]: time="2025-10-28T00:31:39.393939983Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" id:\"7d396169096e75d6d2ccd71b7dec9dbaf9f5cc6bad5919a98f0b56c105e28292\" pid:4093 exit_status:1 exited_at:{seconds:1761611499 nanos:393668469}" Oct 28 00:31:39.408936 containerd[1625]: time="2025-10-28T00:31:39.408796960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.409037 kubelet[2926]: E1028 00:31:39.408988 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:31:39.409071 kubelet[2926]: E1028 00:31:39.409041 2926 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:39.409071 kubelet[2926]: E1028 00:31:39.409057 2926 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9vxm" Oct 28 00:31:39.409118 kubelet[2926]: E1028 00:31:39.409087 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"587df91a9afd2d0210f3801429d0a3bb2a723a37bdcff97bc77c0f0c8d8479a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:41.148557 containerd[1625]: time="2025-10-28T00:31:41.148527869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zbpq4,Uid:961d689d-6f4c-4b9c-bdaf-12f5eeac421a,Namespace:kube-system,Attempt:0,}" Oct 28 00:31:41.149441 containerd[1625]: time="2025-10-28T00:31:41.149006576Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-kfc7q,Uid:2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:41.149587 containerd[1625]: time="2025-10-28T00:31:41.149041746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f4bdd4695-jlv4b,Uid:96642e2b-a099-4486-a4d3-2cc6f34eac9f,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:41.149786 containerd[1625]: time="2025-10-28T00:31:41.149075236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64fb65988f-xxdqr,Uid:5296fefe-d676-448c-ac61-6435527489ed,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:41.149786 containerd[1625]: time="2025-10-28T00:31:41.149094612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-x6j9p,Uid:a3d67cc9-a4af-4a25-892e-b5ffc390b89f,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:41.590341 kubelet[2926]: I1028 00:31:41.590133 2926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 28 00:31:42.563917 containerd[1625]: time="2025-10-28T00:31:42.563808528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78cfdf5758-2h92h,Uid:ab512388-5b28-4703-8a4e-dbe34005b6c3,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:42.704412 kubelet[2926]: I1028 00:31:42.703960 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-f4gjv" podStartSLOduration=7.400763196 podStartE2EDuration="28.703939988s" podCreationTimestamp="2025-10-28 00:31:14 +0000 UTC" firstStartedPulling="2025-10-28 00:31:15.237662693 +0000 UTC m=+21.176384248" lastFinishedPulling="2025-10-28 00:31:36.540839483 +0000 UTC m=+42.479561040" observedRunningTime="2025-10-28 00:31:38.310035399 +0000 UTC m=+44.248756965" watchObservedRunningTime="2025-10-28 00:31:42.703939988 +0000 UTC m=+48.642661553" Oct 28 00:31:43.148328 containerd[1625]: time="2025-10-28T00:31:43.148264990Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-5h5lj,Uid:410c3647-ec73-43b3-abf8-8d8ce60191b1,Namespace:kube-system,Attempt:0,}" Oct 28 00:31:43.605673 systemd-networkd[1294]: califea0d53fae9: Link UP Oct 28 00:31:43.605792 systemd-networkd[1294]: califea0d53fae9: Gained carrier Oct 28 00:31:43.638849 containerd[1625]: 2025-10-28 00:31:41.274 [INFO][4147] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:31:43.638849 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4147] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0 calico-apiserver-64d5cc589f- calico-apiserver 2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece 836 0 2025-10-28 00:31:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64d5cc589f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64d5cc589f-kfc7q eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califea0d53fae9 [] [] }} ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-" Oct 28 00:31:43.638849 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4147] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.638849 containerd[1625]: 2025-10-28 00:31:43.488 [INFO][4289] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" HandleID="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Workload="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4289] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" HandleID="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Workload="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ae220), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64d5cc589f-kfc7q", "timestamp":"2025-10-28 00:31:43.488837069 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4289] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4289] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.490 [INFO][4289] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.517 [INFO][4289] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" host="localhost" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.542 [INFO][4289] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.548 [INFO][4289] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.550 [INFO][4289] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.553 [INFO][4289] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.639101 containerd[1625]: 2025-10-28 00:31:43.553 [INFO][4289] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" host="localhost" Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.554 [INFO][4289] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01 Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.557 [INFO][4289] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" host="localhost" Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4289] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" host="localhost" Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4289] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" host="localhost" Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4289] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:43.641843 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4289] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" HandleID="k8s-pod-network.9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Workload="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.641990 containerd[1625]: 2025-10-28 00:31:43.570 [INFO][4147] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0", GenerateName:"calico-apiserver-64d5cc589f-", Namespace:"calico-apiserver", SelfLink:"", UID:"2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d5cc589f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64d5cc589f-kfc7q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califea0d53fae9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.642058 containerd[1625]: 2025-10-28 00:31:43.570 [INFO][4147] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.642058 containerd[1625]: 2025-10-28 00:31:43.570 [INFO][4147] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califea0d53fae9 ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.642058 containerd[1625]: 2025-10-28 00:31:43.611 [INFO][4147] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.642149 containerd[1625]: 2025-10-28 00:31:43.611 [INFO][4147] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0", GenerateName:"calico-apiserver-64d5cc589f-", Namespace:"calico-apiserver", SelfLink:"", UID:"2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d5cc589f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01", Pod:"calico-apiserver-64d5cc589f-kfc7q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califea0d53fae9", MAC:"26:5e:ac:e0:d3:c6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.642206 containerd[1625]: 2025-10-28 00:31:43.628 [INFO][4147] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-kfc7q" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--kfc7q-eth0" Oct 28 00:31:43.690860 systemd-networkd[1294]: cali8c38614ded7: Link UP Oct 28 00:31:43.692735 systemd-networkd[1294]: cali8c38614ded7: Gained carrier Oct 28 00:31:43.793965 containerd[1625]: 2025-10-28 00:31:41.310 [INFO][4157] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:31:43.793965 containerd[1625]: 2025-10-28 00:31:41.910 [INFO][4157] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0 calico-apiserver-64fb65988f- calico-apiserver 5296fefe-d676-448c-ac61-6435527489ed 835 0 2025-10-28 00:31:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64fb65988f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64fb65988f-xxdqr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8c38614ded7 [] [] }} ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-" Oct 28 00:31:43.793965 containerd[1625]: 2025-10-28 00:31:41.910 [INFO][4157] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.793965 containerd[1625]: 2025-10-28 00:31:43.488 [INFO][4286] ipam/ipam_plugin.go 227: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" HandleID="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Workload="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4286] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" HandleID="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Workload="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003099d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64fb65988f-xxdqr", "timestamp":"2025-10-28 00:31:43.488848029 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4286] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4286] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.565 [INFO][4286] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.616 [INFO][4286] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" host="localhost" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.630 [INFO][4286] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.649 [INFO][4286] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.650 [INFO][4286] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.653 [INFO][4286] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.808608 containerd[1625]: 2025-10-28 00:31:43.653 [INFO][4286] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" host="localhost" Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.654 [INFO][4286] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306 Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.663 [INFO][4286] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" host="localhost" Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.677 [INFO][4286] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" host="localhost" Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.677 [INFO][4286] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" host="localhost" Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.677 [INFO][4286] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:43.808797 containerd[1625]: 2025-10-28 00:31:43.677 [INFO][4286] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" HandleID="k8s-pod-network.d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Workload="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.808907 containerd[1625]: 2025-10-28 00:31:43.682 [INFO][4157] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0", GenerateName:"calico-apiserver-64fb65988f-", Namespace:"calico-apiserver", SelfLink:"", UID:"5296fefe-d676-448c-ac61-6435527489ed", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64fb65988f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64fb65988f-xxdqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8c38614ded7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.814096 containerd[1625]: 2025-10-28 00:31:43.682 [INFO][4157] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.814096 containerd[1625]: 2025-10-28 00:31:43.682 [INFO][4157] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8c38614ded7 ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.814096 containerd[1625]: 2025-10-28 00:31:43.693 [INFO][4157] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.814164 containerd[1625]: 2025-10-28 00:31:43.694 [INFO][4157] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0", GenerateName:"calico-apiserver-64fb65988f-", Namespace:"calico-apiserver", SelfLink:"", UID:"5296fefe-d676-448c-ac61-6435527489ed", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64fb65988f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306", Pod:"calico-apiserver-64fb65988f-xxdqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8c38614ded7", MAC:"fe:1c:18:ef:f7:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.814205 containerd[1625]: 2025-10-28 00:31:43.791 [INFO][4157] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" Namespace="calico-apiserver" Pod="calico-apiserver-64fb65988f-xxdqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--64fb65988f--xxdqr-eth0" Oct 28 00:31:43.856666 systemd-networkd[1294]: cali0249cfc6d03: Link UP Oct 28 00:31:43.856905 systemd-networkd[1294]: cali0249cfc6d03: Gained carrier Oct 28 00:31:43.872305 containerd[1625]: 2025-10-28 00:31:41.258 [INFO][4137] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:31:43.872305 containerd[1625]: 2025-10-28 00:31:41.899 [INFO][4137] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0 coredns-674b8bbfcf- kube-system 961d689d-6f4c-4b9c-bdaf-12f5eeac421a 831 0 2025-10-28 00:31:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-zbpq4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0249cfc6d03 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-" Oct 28 00:31:43.872305 containerd[1625]: 2025-10-28 00:31:41.899 [INFO][4137] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.872305 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4285] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" 
HandleID="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Workload="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4285] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" HandleID="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Workload="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003217b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-zbpq4", "timestamp":"2025-10-28 00:31:43.489189367 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.489 [INFO][4285] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.678 [INFO][4285] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.678 [INFO][4285] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.789 [INFO][4285] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" host="localhost" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.796 [INFO][4285] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.803 [INFO][4285] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.805 [INFO][4285] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.806 [INFO][4285] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:43.872765 containerd[1625]: 2025-10-28 00:31:43.807 [INFO][4285] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" host="localhost" Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.807 [INFO][4285] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.818 [INFO][4285] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" host="localhost" Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4285] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" host="localhost" Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4285] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" host="localhost" Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4285] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:43.874186 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4285] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" HandleID="k8s-pod-network.9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Workload="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.874352 containerd[1625]: 2025-10-28 00:31:43.837 [INFO][4137] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"961d689d-6f4c-4b9c-bdaf-12f5eeac421a", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-zbpq4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0249cfc6d03", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.874443 containerd[1625]: 2025-10-28 00:31:43.838 [INFO][4137] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.874443 containerd[1625]: 2025-10-28 00:31:43.838 [INFO][4137] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0249cfc6d03 ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.874443 containerd[1625]: 2025-10-28 00:31:43.843 [INFO][4137] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.874539 containerd[1625]: 2025-10-28 00:31:43.843 [INFO][4137] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"961d689d-6f4c-4b9c-bdaf-12f5eeac421a", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc", Pod:"coredns-674b8bbfcf-zbpq4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0249cfc6d03", MAC:"2e:c9:a7:a0:63:b3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:43.874539 containerd[1625]: 2025-10-28 00:31:43.864 [INFO][4137] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" Namespace="kube-system" Pod="coredns-674b8bbfcf-zbpq4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--zbpq4-eth0" Oct 28 00:31:43.965713 systemd-networkd[1294]: cali2c0b32ad188: Link UP Oct 28 00:31:43.974948 systemd-networkd[1294]: cali2c0b32ad188: Gained carrier Oct 28 00:31:44.003182 containerd[1625]: time="2025-10-28T00:31:44.003139344Z" level=info msg="connecting to shim d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306" address="unix:///run/containerd/s/c56af476fbf111a2fe1219a2eabf043e007b7aad81fdfd4d967e487d24a04abf" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.003522 containerd[1625]: time="2025-10-28T00:31:44.003480053Z" level=info msg="connecting to shim 9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01" address="unix:///run/containerd/s/b0b56a65f8d35f7363dd44c4f455597eb345f8836964639cfede2a5a7ed75d24" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.015691 containerd[1625]: time="2025-10-28T00:31:44.015657426Z" level=info msg="connecting to shim 9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc" address="unix:///run/containerd/s/c0807d39949c7923394261b5b542efb818a01f55e26f13ae57f9490771eecee6" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:41.392 [INFO][4181] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4181] cni-plugin/plugin.go 340: Calico CNI found 
existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0 calico-kube-controllers-6f4bdd4695- calico-system 96642e2b-a099-4486-a4d3-2cc6f34eac9f 834 0 2025-10-28 00:31:15 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6f4bdd4695 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6f4bdd4695-jlv4b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2c0b32ad188 [] [] }} ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4181] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.488 [INFO][4282] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" HandleID="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Workload="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.490 [INFO][4282] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" HandleID="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" 
Workload="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003f6be0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6f4bdd4695-jlv4b", "timestamp":"2025-10-28 00:31:43.488551049 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.490 [INFO][4282] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4282] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.834 [INFO][4282] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.841 [INFO][4282] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.902 [INFO][4282] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.915 [INFO][4282] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.920 [INFO][4282] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.926 [INFO][4282] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.926 [INFO][4282] ipam/ipam.go 1219: Attempting to assign 1 
addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.932 [INFO][4282] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.939 [INFO][4282] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.948 [INFO][4282] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.948 [INFO][4282] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" host="localhost" Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.949 [INFO][4282] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:31:44.029353 containerd[1625]: 2025-10-28 00:31:43.949 [INFO][4282] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" HandleID="k8s-pod-network.0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Workload="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:43.957 [INFO][4181] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0", GenerateName:"calico-kube-controllers-6f4bdd4695-", Namespace:"calico-system", SelfLink:"", UID:"96642e2b-a099-4486-a4d3-2cc6f34eac9f", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f4bdd4695", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6f4bdd4695-jlv4b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2c0b32ad188", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:43.957 [INFO][4181] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:43.957 [INFO][4181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2c0b32ad188 ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:43.974 [INFO][4181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:43.975 [INFO][4181] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0", GenerateName:"calico-kube-controllers-6f4bdd4695-", Namespace:"calico-system", SelfLink:"", UID:"96642e2b-a099-4486-a4d3-2cc6f34eac9f", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f4bdd4695", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c", Pod:"calico-kube-controllers-6f4bdd4695-jlv4b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2c0b32ad188", MAC:"1e:57:da:6a:52:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.030195 containerd[1625]: 2025-10-28 00:31:44.025 [INFO][4181] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" Namespace="calico-system" Pod="calico-kube-controllers-6f4bdd4695-jlv4b" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f4bdd4695--jlv4b-eth0" Oct 28 00:31:44.051689 systemd[1]: Started cri-containerd-9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01.scope - 
libcontainer container 9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01. Oct 28 00:31:44.055825 systemd[1]: Started cri-containerd-9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc.scope - libcontainer container 9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc. Oct 28 00:31:44.057662 systemd[1]: Started cri-containerd-d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306.scope - libcontainer container d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306. Oct 28 00:31:44.071092 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.071441 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.089279 systemd-networkd[1294]: cali60cf45501f6: Link UP Oct 28 00:31:44.089532 systemd-networkd[1294]: cali60cf45501f6: Gained carrier Oct 28 00:31:44.107820 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:41.350 [INFO][4168] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4168] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--666569f655--x6j9p-eth0 goldmane-666569f655- calico-system a3d67cc9-a4af-4a25-892e-b5ffc390b89f 833 0 2025-10-28 00:31:13 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-666569f655-x6j9p eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali60cf45501f6 [] [] }} 
ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:41.909 [INFO][4168] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.488 [INFO][4287] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" HandleID="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Workload="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.490 [INFO][4287] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" HandleID="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Workload="localhost-k8s-goldmane--666569f655--x6j9p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000350a90), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-666569f655-x6j9p", "timestamp":"2025-10-28 00:31:43.488564502 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.490 [INFO][4287] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.949 [INFO][4287] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.949 [INFO][4287] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:43.978 [INFO][4287] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.029 [INFO][4287] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.034 [INFO][4287] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.040 [INFO][4287] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.044 [INFO][4287] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.045 [INFO][4287] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.048 [INFO][4287] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80 Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.059 [INFO][4287] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.080 [INFO][4287] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.080 [INFO][4287] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" host="localhost" Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.080 [INFO][4287] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:44.128553 containerd[1625]: 2025-10-28 00:31:44.080 [INFO][4287] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" HandleID="k8s-pod-network.e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Workload="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.082 [INFO][4168] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--x6j9p-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"a3d67cc9-a4af-4a25-892e-b5ffc390b89f", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-666569f655-x6j9p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali60cf45501f6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.082 [INFO][4168] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.082 [INFO][4168] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60cf45501f6 ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.089 [INFO][4168] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.094 [INFO][4168] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--x6j9p-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"a3d67cc9-a4af-4a25-892e-b5ffc390b89f", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80", Pod:"goldmane-666569f655-x6j9p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali60cf45501f6", MAC:"a2:fb:87:e2:7f:00", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.142828 containerd[1625]: 2025-10-28 00:31:44.117 [INFO][4168] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" Namespace="calico-system" Pod="goldmane-666569f655-x6j9p" 
WorkloadEndpoint="localhost-k8s-goldmane--666569f655--x6j9p-eth0" Oct 28 00:31:44.183664 containerd[1625]: time="2025-10-28T00:31:44.183558210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zbpq4,Uid:961d689d-6f4c-4b9c-bdaf-12f5eeac421a,Namespace:kube-system,Attempt:0,} returns sandbox id \"9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc\"" Oct 28 00:31:44.186938 containerd[1625]: time="2025-10-28T00:31:44.186912038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64fb65988f-xxdqr,Uid:5296fefe-d676-448c-ac61-6435527489ed,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d20ed974c78466e50b9d28684ba535897675e665d132dcfe2f2fa3a403fff306\"" Oct 28 00:31:44.187894 containerd[1625]: time="2025-10-28T00:31:44.187821566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-kfc7q,Uid:2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9efab8211df02c5717674d5cbdba93b1159cfdfbf3d8060ceb82767ce961cb01\"" Oct 28 00:31:44.210534 containerd[1625]: time="2025-10-28T00:31:44.209892010Z" level=info msg="connecting to shim 0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c" address="unix:///run/containerd/s/ddbf3144b72e9864cedc27377f28f00d45999967caab98a6fb82ab043ae70158" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.210398 systemd-networkd[1294]: vxlan.calico: Link UP Oct 28 00:31:44.210402 systemd-networkd[1294]: vxlan.calico: Gained carrier Oct 28 00:31:44.238283 containerd[1625]: time="2025-10-28T00:31:44.238002945Z" level=info msg="connecting to shim e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80" address="unix:///run/containerd/s/506798b06717a0277a279923ef044247f3d9aeaba106e935f52286758c58fcd0" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.263723 systemd[1]: Started cri-containerd-0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c.scope - 
libcontainer container 0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c. Oct 28 00:31:44.277759 systemd[1]: Started cri-containerd-e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80.scope - libcontainer container e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80. Oct 28 00:31:44.289858 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.300942 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.349785 systemd-networkd[1294]: cali9305cbb1b89: Link UP Oct 28 00:31:44.351238 systemd-networkd[1294]: cali9305cbb1b89: Gained carrier Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.206 [INFO][4564] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--78cfdf5758--2h92h-eth0 whisker-78cfdf5758- calico-system ab512388-5b28-4703-8a4e-dbe34005b6c3 898 0 2025-10-28 00:31:17 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78cfdf5758 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-78cfdf5758-2h92h eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9305cbb1b89 [] [] }} ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.207 [INFO][4564] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.400990 
containerd[1625]: 2025-10-28 00:31:44.260 [INFO][4622] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.260 [INFO][4622] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f680), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-78cfdf5758-2h92h", "timestamp":"2025-10-28 00:31:44.26072346 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.260 [INFO][4622] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.260 [INFO][4622] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.260 [INFO][4622] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.268 [INFO][4622] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.274 [INFO][4622] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.276 [INFO][4622] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.277 [INFO][4622] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.279 [INFO][4622] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.279 [INFO][4622] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.282 [INFO][4622] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790 Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.297 [INFO][4622] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4622] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4622] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" host="localhost" Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4622] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:44.400990 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4622] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.336 [INFO][4564] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78cfdf5758--2h92h-eth0", GenerateName:"whisker-78cfdf5758-", Namespace:"calico-system", SelfLink:"", UID:"ab512388-5b28-4703-8a4e-dbe34005b6c3", ResourceVersion:"898", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78cfdf5758", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-78cfdf5758-2h92h", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9305cbb1b89", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.336 [INFO][4564] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.337 [INFO][4564] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9305cbb1b89 ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.357 [INFO][4564] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.360 [INFO][4564] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" 
WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78cfdf5758--2h92h-eth0", GenerateName:"whisker-78cfdf5758-", Namespace:"calico-system", SelfLink:"", UID:"ab512388-5b28-4703-8a4e-dbe34005b6c3", ResourceVersion:"898", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78cfdf5758", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790", Pod:"whisker-78cfdf5758-2h92h", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9305cbb1b89", MAC:"26:99:d1:5c:61:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.402539 containerd[1625]: 2025-10-28 00:31:44.386 [INFO][4564] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Namespace="calico-system" Pod="whisker-78cfdf5758-2h92h" WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:44.440066 containerd[1625]: time="2025-10-28T00:31:44.440029914Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6f4bdd4695-jlv4b,Uid:96642e2b-a099-4486-a4d3-2cc6f34eac9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0187474902e2b69c5a4999dc9b5bba322d08ca1d6fcdb618f02620aea4eb333c\"" Oct 28 00:31:44.513238 systemd-networkd[1294]: cali58c77b9f062: Link UP Oct 28 00:31:44.514012 systemd-networkd[1294]: cali58c77b9f062: Gained carrier Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.201 [INFO][4565] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0 coredns-674b8bbfcf- kube-system 410c3647-ec73-43b3-abf8-8d8ce60191b1 832 0 2025-10-28 00:31:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-5h5lj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali58c77b9f062 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.203 [INFO][4565] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.261 [INFO][4618] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" HandleID="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Workload="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 
00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.261 [INFO][4618] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" HandleID="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Workload="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033aef0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-5h5lj", "timestamp":"2025-10-28 00:31:44.261415948 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.261 [INFO][4618] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4618] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.316 [INFO][4618] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.373 [INFO][4618] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.406 [INFO][4618] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.415 [INFO][4618] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.419 [INFO][4618] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.424 [INFO][4618] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.426 [INFO][4618] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.433 [INFO][4618] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3 Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.462 [INFO][4618] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.505 [INFO][4618] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.505 [INFO][4618] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" host="localhost" Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.505 [INFO][4618] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:44.557625 containerd[1625]: 2025-10-28 00:31:44.505 [INFO][4618] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" HandleID="k8s-pod-network.73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Workload="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.509 [INFO][4565] cni-plugin/k8s.go 418: Populated endpoint ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"410c3647-ec73-43b3-abf8-8d8ce60191b1", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-5h5lj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali58c77b9f062", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.510 [INFO][4565] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.510 [INFO][4565] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali58c77b9f062 ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.514 [INFO][4565] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.515 [INFO][4565] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"410c3647-ec73-43b3-abf8-8d8ce60191b1", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3", Pod:"coredns-674b8bbfcf-5h5lj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali58c77b9f062", MAC:"ae:89:8d:d4:73:79", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:44.568608 containerd[1625]: 2025-10-28 00:31:44.543 [INFO][4565] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" Namespace="kube-system" Pod="coredns-674b8bbfcf-5h5lj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5h5lj-eth0" Oct 28 00:31:44.569037 containerd[1625]: time="2025-10-28T00:31:44.569012271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-x6j9p,Uid:a3d67cc9-a4af-4a25-892e-b5ffc390b89f,Namespace:calico-system,Attempt:0,} returns sandbox id \"e89e96078601c2d0df34d466010b3bc0e10572db249ea40495b9ddef48725a80\"" Oct 28 00:31:44.647185 containerd[1625]: time="2025-10-28T00:31:44.647131346Z" level=info msg="connecting to shim d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" address="unix:///run/containerd/s/bff4412d03dca65c600c0da5c5845b3133a4ad159c317468ed3ba6921430b278" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.671415 systemd[1]: Started cri-containerd-d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790.scope - libcontainer container d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790. 
Oct 28 00:31:44.684802 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.704719 containerd[1625]: time="2025-10-28T00:31:44.704669062Z" level=info msg="connecting to shim 73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3" address="unix:///run/containerd/s/bdd035d1dda172d934a2d8ba664029f692ea20739124276e0e02c5ab0da17aff" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:44.732672 systemd[1]: Started cri-containerd-73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3.scope - libcontainer container 73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3. Oct 28 00:31:44.741542 containerd[1625]: time="2025-10-28T00:31:44.741463046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78cfdf5758-2h92h,Uid:ab512388-5b28-4703-8a4e-dbe34005b6c3,Namespace:calico-system,Attempt:0,} returns sandbox id \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\"" Oct 28 00:31:44.749954 containerd[1625]: time="2025-10-28T00:31:44.748755747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:31:44.765715 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:44.795264 containerd[1625]: time="2025-10-28T00:31:44.795055931Z" level=info msg="CreateContainer within sandbox \"9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 28 00:31:44.913599 containerd[1625]: time="2025-10-28T00:31:44.912865818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5h5lj,Uid:410c3647-ec73-43b3-abf8-8d8ce60191b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3\"" Oct 28 00:31:44.948459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount798716574.mount: 
Deactivated successfully. Oct 28 00:31:44.950082 containerd[1625]: time="2025-10-28T00:31:44.950060490Z" level=info msg="CreateContainer within sandbox \"73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 28 00:31:44.955414 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3090463171.mount: Deactivated successfully. Oct 28 00:31:44.956654 containerd[1625]: time="2025-10-28T00:31:44.956533651Z" level=info msg="Container 577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:44.991832 containerd[1625]: time="2025-10-28T00:31:44.991794255Z" level=info msg="CreateContainer within sandbox \"9d7348aa9870f74fc839e83d3a249b42f936ca4fcd938fba721700b5a33c66bc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022\"" Oct 28 00:31:44.993781 containerd[1625]: time="2025-10-28T00:31:44.993769224Z" level=info msg="Container 9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:31:45.015764 containerd[1625]: time="2025-10-28T00:31:45.015736956Z" level=info msg="CreateContainer within sandbox \"73fde4fa646915ef2dd016444ea83a83333622485b58698c46f3ede2c04996b3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2\"" Oct 28 00:31:45.029466 containerd[1625]: time="2025-10-28T00:31:45.029353661Z" level=info msg="StartContainer for \"9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2\"" Oct 28 00:31:45.029704 containerd[1625]: time="2025-10-28T00:31:45.029595747Z" level=info msg="StartContainer for \"577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022\"" Oct 28 00:31:45.030184 containerd[1625]: time="2025-10-28T00:31:45.030167827Z" level=info msg="connecting to shim 
9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2" address="unix:///run/containerd/s/bdd035d1dda172d934a2d8ba664029f692ea20739124276e0e02c5ab0da17aff" protocol=ttrpc version=3 Oct 28 00:31:45.030330 containerd[1625]: time="2025-10-28T00:31:45.030319056Z" level=info msg="connecting to shim 577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022" address="unix:///run/containerd/s/c0807d39949c7923394261b5b542efb818a01f55e26f13ae57f9490771eecee6" protocol=ttrpc version=3 Oct 28 00:31:45.045684 systemd[1]: Started cri-containerd-9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2.scope - libcontainer container 9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2. Oct 28 00:31:45.048928 systemd[1]: Started cri-containerd-577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022.scope - libcontainer container 577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022. Oct 28 00:31:45.064589 kubelet[2926]: E1028 00:31:45.062963 2926 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab512388_5b28_4703_8a4e_dbe34005b6c3.slice/cri-containerd-d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790.scope\": RecentStats: unable to find data in memory cache]" Oct 28 00:31:45.131148 containerd[1625]: time="2025-10-28T00:31:45.131093662Z" level=info msg="StartContainer for \"9c441fe4f273a53b3aa4018eab795c7b2731603f02598e2bc385630129346db2\" returns successfully" Oct 28 00:31:45.133089 containerd[1625]: time="2025-10-28T00:31:45.133070062Z" level=info msg="StartContainer for \"577d231fc5f0248e182412d6d7db16e2cd82741a9a618d868037092eaf97a022\" returns successfully" Oct 28 00:31:45.190833 systemd-networkd[1294]: cali0249cfc6d03: Gained IPv6LL Oct 28 00:31:45.195457 containerd[1625]: time="2025-10-28T00:31:45.195355733Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 
28 00:31:45.196808 containerd[1625]: time="2025-10-28T00:31:45.196651960Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:31:45.196808 containerd[1625]: time="2025-10-28T00:31:45.196785507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:31:45.204786 kubelet[2926]: E1028 00:31:45.204621 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:45.208213 kubelet[2926]: E1028 00:31:45.208076 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:45.208791 containerd[1625]: time="2025-10-28T00:31:45.208524846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:31:45.228377 kubelet[2926]: E1028 00:31:45.228312 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9j5lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:45.254771 systemd-networkd[1294]: califea0d53fae9: Gained IPv6LL Oct 28 00:31:45.265873 kubelet[2926]: E1028 00:31:45.265816 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:31:45.384529 systemd-networkd[1294]: vxlan.calico: Gained IPv6LL Oct 28 00:31:45.384757 systemd-networkd[1294]: cali8c38614ded7: Gained IPv6LL Oct 28 00:31:45.398764 systemd[1]: Started sshd@7-139.178.70.100:22-185.156.73.233:56996.service - OpenSSH per-connection server daemon (185.156.73.233:56996). 
Oct 28 00:31:45.558031 containerd[1625]: time="2025-10-28T00:31:45.557877573Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:45.559666 containerd[1625]: time="2025-10-28T00:31:45.559592398Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:31:45.559666 containerd[1625]: time="2025-10-28T00:31:45.559647677Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:31:45.560929 kubelet[2926]: E1028 00:31:45.560667 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:45.560929 kubelet[2926]: E1028 00:31:45.560701 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:45.560929 kubelet[2926]: E1028 00:31:45.560889 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjp7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:45.561838 containerd[1625]: time="2025-10-28T00:31:45.561766787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:31:45.563364 kubelet[2926]: E1028 00:31:45.562897 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:31:45.703290 systemd-networkd[1294]: cali58c77b9f062: Gained IPv6LL Oct 28 00:31:45.830873 systemd-networkd[1294]: cali2c0b32ad188: Gained IPv6LL Oct 28 00:31:45.831552 systemd-networkd[1294]: cali60cf45501f6: Gained IPv6LL Oct 28 00:31:45.894757 systemd-networkd[1294]: cali9305cbb1b89: Gained IPv6LL Oct 28 00:31:45.939905 containerd[1625]: time="2025-10-28T00:31:45.939868930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:45.972761 containerd[1625]: time="2025-10-28T00:31:45.972698257Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:31:45.972893 containerd[1625]: time="2025-10-28T00:31:45.972734573Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:31:45.972936 
kubelet[2926]: E1028 00:31:45.972895 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:31:45.972936 kubelet[2926]: E1028 00:31:45.972923 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:31:45.973393 kubelet[2926]: E1028 00:31:45.973070 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:71fa32c1794749f18e2004eee7d3e458,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5t5zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78cfdf5758-2h92h_calico-system(ab512388-5b28-4703-8a4e-dbe34005b6c3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:45.973466 containerd[1625]: time="2025-10-28T00:31:45.973293007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:31:46.159860 kubelet[2926]: E1028 00:31:46.159699 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:31:46.161037 kubelet[2926]: E1028 00:31:46.160634 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:31:46.186403 kubelet[2926]: I1028 00:31:46.185960 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zbpq4" podStartSLOduration=45.177911756 podStartE2EDuration="45.177911756s" podCreationTimestamp="2025-10-28 00:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:31:46.153253745 +0000 UTC m=+52.091975311" watchObservedRunningTime="2025-10-28 00:31:46.177911756 +0000 UTC m=+52.116633317" Oct 28 00:31:46.237950 kubelet[2926]: I1028 00:31:46.237881 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5h5lj" podStartSLOduration=45.237863997 podStartE2EDuration="45.237863997s" podCreationTimestamp="2025-10-28 00:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:31:46.235293492 +0000 UTC m=+52.174015058" watchObservedRunningTime="2025-10-28 00:31:46.237863997 +0000 UTC m=+52.176585558" Oct 28 00:31:46.391417 containerd[1625]: time="2025-10-28T00:31:46.391345020Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:46.392186 containerd[1625]: time="2025-10-28T00:31:46.392078803Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:31:46.392186 containerd[1625]: time="2025-10-28T00:31:46.392160043Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:31:46.392469 kubelet[2926]: E1028 00:31:46.392394 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:31:46.392469 kubelet[2926]: E1028 00:31:46.392455 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:31:46.392808 kubelet[2926]: E1028 00:31:46.392716 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjdbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6f4bdd4695-jlv4b_calico-system(96642e2b-a099-4486-a4d3-2cc6f34eac9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:46.392966 containerd[1625]: time="2025-10-28T00:31:46.392921152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:31:46.394043 kubelet[2926]: E1028 00:31:46.393998 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:31:46.729028 containerd[1625]: 
time="2025-10-28T00:31:46.728985890Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:46.729487 containerd[1625]: time="2025-10-28T00:31:46.729444000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:31:46.729566 containerd[1625]: time="2025-10-28T00:31:46.729503246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:31:46.729681 kubelet[2926]: E1028 00:31:46.729638 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:31:46.729681 kubelet[2926]: E1028 00:31:46.729679 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:31:46.729911 kubelet[2926]: E1028 00:31:46.729853 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6blqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:46.730801 containerd[1625]: time="2025-10-28T00:31:46.730773910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 00:31:46.730945 kubelet[2926]: E1028 00:31:46.730915 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:31:46.769826 sshd[4948]: Invalid user admin from 185.156.73.233 port 56996 Oct 28 00:31:46.924582 sshd[4948]: 
Connection closed by invalid user admin 185.156.73.233 port 56996 [preauth] Oct 28 00:31:46.925869 systemd[1]: sshd@7-139.178.70.100:22-185.156.73.233:56996.service: Deactivated successfully. Oct 28 00:31:47.083523 containerd[1625]: time="2025-10-28T00:31:47.083429661Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:47.084507 containerd[1625]: time="2025-10-28T00:31:47.083963004Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:31:47.084507 containerd[1625]: time="2025-10-28T00:31:47.083992838Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:31:47.084629 kubelet[2926]: E1028 00:31:47.084180 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:31:47.084629 kubelet[2926]: E1028 00:31:47.084212 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:31:47.084629 kubelet[2926]: E1028 00:31:47.084294 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t5zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78cfdf5758-2h92h_calico-system(ab512388-5b28-4703-8a4e-dbe34005b6c3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:47.085801 kubelet[2926]: E1028 00:31:47.085755 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78cfdf5758-2h92h" podUID="ab512388-5b28-4703-8a4e-dbe34005b6c3" Oct 28 00:31:47.122093 kubelet[2926]: E1028 00:31:47.122017 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:31:47.123437 kubelet[2926]: E1028 00:31:47.123422 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:31:47.124805 containerd[1625]: time="2025-10-28T00:31:47.124784495Z" level=info msg="StopPodSandbox for \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\"" Oct 28 00:31:47.140060 systemd[1]: cri-containerd-d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790.scope: Deactivated successfully. Oct 28 00:31:47.144549 containerd[1625]: time="2025-10-28T00:31:47.144527132Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" id:\"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" pid:4800 exit_status:137 exited_at:{seconds:1761611507 nanos:142029458}" Oct 28 00:31:47.172398 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790-rootfs.mount: Deactivated successfully. 
Oct 28 00:31:47.178138 containerd[1625]: time="2025-10-28T00:31:47.178113886Z" level=info msg="shim disconnected" id=d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790 namespace=k8s.io Oct 28 00:31:47.178138 containerd[1625]: time="2025-10-28T00:31:47.178132541Z" level=warning msg="cleaning up after shim disconnected" id=d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790 namespace=k8s.io Oct 28 00:31:47.180068 containerd[1625]: time="2025-10-28T00:31:47.178139846Z" level=info msg="cleaning up dead shim" namespace=k8s.io Oct 28 00:31:47.243930 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790-shm.mount: Deactivated successfully. Oct 28 00:31:47.257202 containerd[1625]: time="2025-10-28T00:31:47.256828301Z" level=info msg="received exit event sandbox_id:\"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" exit_status:137 exited_at:{seconds:1761611507 nanos:142029458}" Oct 28 00:31:47.344978 systemd-networkd[1294]: cali9305cbb1b89: Link DOWN Oct 28 00:31:47.344984 systemd-networkd[1294]: cali9305cbb1b89: Lost carrier Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.343 [INFO][5015] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.343 [INFO][5015] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" iface="eth0" netns="/var/run/netns/cni-700623f7-e9ba-3de4-c485-7a57c0851ec7" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.343 [INFO][5015] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" iface="eth0" netns="/var/run/netns/cni-700623f7-e9ba-3de4-c485-7a57c0851ec7" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.349 [INFO][5015] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" after=6.054264ms iface="eth0" netns="/var/run/netns/cni-700623f7-e9ba-3de4-c485-7a57c0851ec7" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.349 [INFO][5015] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.349 [INFO][5015] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.382 [INFO][5024] ipam/ipam_plugin.go 436: Releasing address using handleID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.383 [INFO][5024] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.383 [INFO][5024] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.415 [INFO][5024] ipam/ipam_plugin.go 455: Released address using handleID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.415 [INFO][5024] ipam/ipam_plugin.go 464: Releasing address using workloadID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.416 [INFO][5024] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:47.422070 containerd[1625]: 2025-10-28 00:31:47.418 [INFO][5015] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:47.423901 systemd[1]: run-netns-cni\x2d700623f7\x2de9ba\x2d3de4\x2dc485\x2d7a57c0851ec7.mount: Deactivated successfully. 
Oct 28 00:31:47.426766 containerd[1625]: time="2025-10-28T00:31:47.424439646Z" level=info msg="TearDown network for sandbox \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" successfully" Oct 28 00:31:47.426766 containerd[1625]: time="2025-10-28T00:31:47.426116694Z" level=info msg="StopPodSandbox for \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" returns successfully" Oct 28 00:31:47.487365 kubelet[2926]: I1028 00:31:47.487259 2926 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t5zc\" (UniqueName: \"kubernetes.io/projected/ab512388-5b28-4703-8a4e-dbe34005b6c3-kube-api-access-5t5zc\") pod \"ab512388-5b28-4703-8a4e-dbe34005b6c3\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " Oct 28 00:31:47.487365 kubelet[2926]: I1028 00:31:47.487311 2926 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-ca-bundle\") pod \"ab512388-5b28-4703-8a4e-dbe34005b6c3\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " Oct 28 00:31:47.487365 kubelet[2926]: I1028 00:31:47.487332 2926 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-backend-key-pair\") pod \"ab512388-5b28-4703-8a4e-dbe34005b6c3\" (UID: \"ab512388-5b28-4703-8a4e-dbe34005b6c3\") " Oct 28 00:31:47.502601 systemd[1]: var-lib-kubelet-pods-ab512388\x2d5b28\x2d4703\x2d8a4e\x2ddbe34005b6c3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5t5zc.mount: Deactivated successfully. Oct 28 00:31:47.505327 systemd[1]: var-lib-kubelet-pods-ab512388\x2d5b28\x2d4703\x2d8a4e\x2ddbe34005b6c3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Oct 28 00:31:47.512238 kubelet[2926]: I1028 00:31:47.511317 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ab512388-5b28-4703-8a4e-dbe34005b6c3" (UID: "ab512388-5b28-4703-8a4e-dbe34005b6c3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Oct 28 00:31:47.512329 kubelet[2926]: I1028 00:31:47.511623 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab512388-5b28-4703-8a4e-dbe34005b6c3-kube-api-access-5t5zc" (OuterVolumeSpecName: "kube-api-access-5t5zc") pod "ab512388-5b28-4703-8a4e-dbe34005b6c3" (UID: "ab512388-5b28-4703-8a4e-dbe34005b6c3"). InnerVolumeSpecName "kube-api-access-5t5zc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Oct 28 00:31:47.512329 kubelet[2926]: I1028 00:31:47.512301 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ab512388-5b28-4703-8a4e-dbe34005b6c3" (UID: "ab512388-5b28-4703-8a4e-dbe34005b6c3"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Oct 28 00:31:47.587984 kubelet[2926]: I1028 00:31:47.587961 2926 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t5zc\" (UniqueName: \"kubernetes.io/projected/ab512388-5b28-4703-8a4e-dbe34005b6c3-kube-api-access-5t5zc\") on node \"localhost\" DevicePath \"\"" Oct 28 00:31:47.587984 kubelet[2926]: I1028 00:31:47.587978 2926 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Oct 28 00:31:47.587984 kubelet[2926]: I1028 00:31:47.587985 2926 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ab512388-5b28-4703-8a4e-dbe34005b6c3-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Oct 28 00:31:48.155246 systemd[1]: Removed slice kubepods-besteffort-podab512388_5b28_4703_8a4e_dbe34005b6c3.slice - libcontainer container kubepods-besteffort-podab512388_5b28_4703_8a4e_dbe34005b6c3.slice. Oct 28 00:31:48.244664 systemd[1]: Created slice kubepods-besteffort-poda4162029_5497_4f8b_a7bf_68541ba5fac8.slice - libcontainer container kubepods-besteffort-poda4162029_5497_4f8b_a7bf_68541ba5fac8.slice. 
Oct 28 00:31:48.291801 kubelet[2926]: I1028 00:31:48.291773 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmfp\" (UniqueName: \"kubernetes.io/projected/a4162029-5497-4f8b-a7bf-68541ba5fac8-kube-api-access-tpmfp\") pod \"whisker-765696dbb4-7zljh\" (UID: \"a4162029-5497-4f8b-a7bf-68541ba5fac8\") " pod="calico-system/whisker-765696dbb4-7zljh" Oct 28 00:31:48.293072 kubelet[2926]: I1028 00:31:48.293044 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4162029-5497-4f8b-a7bf-68541ba5fac8-whisker-ca-bundle\") pod \"whisker-765696dbb4-7zljh\" (UID: \"a4162029-5497-4f8b-a7bf-68541ba5fac8\") " pod="calico-system/whisker-765696dbb4-7zljh" Oct 28 00:31:48.293306 kubelet[2926]: I1028 00:31:48.293089 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a4162029-5497-4f8b-a7bf-68541ba5fac8-whisker-backend-key-pair\") pod \"whisker-765696dbb4-7zljh\" (UID: \"a4162029-5497-4f8b-a7bf-68541ba5fac8\") " pod="calico-system/whisker-765696dbb4-7zljh" Oct 28 00:31:48.556300 containerd[1625]: time="2025-10-28T00:31:48.556258508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-765696dbb4-7zljh,Uid:a4162029-5497-4f8b-a7bf-68541ba5fac8,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:48.657712 systemd-networkd[1294]: cali3ebe50f790c: Link UP Oct 28 00:31:48.657834 systemd-networkd[1294]: cali3ebe50f790c: Gained carrier Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.589 [INFO][5046] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--765696dbb4--7zljh-eth0 whisker-765696dbb4- calico-system a4162029-5497-4f8b-a7bf-68541ba5fac8 1028 0 2025-10-28 00:31:48 +0000 UTC 
map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:765696dbb4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-765696dbb4-7zljh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3ebe50f790c [] [] }} ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.589 [INFO][5046] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.605 [INFO][5057] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" HandleID="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Workload="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.605 [INFO][5057] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" HandleID="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Workload="localhost-k8s-whisker--765696dbb4--7zljh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d50b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-765696dbb4-7zljh", "timestamp":"2025-10-28 00:31:48.605072029 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.605 [INFO][5057] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.605 [INFO][5057] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.605 [INFO][5057] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.609 [INFO][5057] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.612 [INFO][5057] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.615 [INFO][5057] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.616 [INFO][5057] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.617 [INFO][5057] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.617 [INFO][5057] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.618 [INFO][5057] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0 Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.632 [INFO][5057] ipam/ipam.go 1246: Writing block in order to claim IPs 
block=192.168.88.128/26 handle="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.647 [INFO][5057] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.647 [INFO][5057] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" host="localhost" Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.647 [INFO][5057] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:48.677512 containerd[1625]: 2025-10-28 00:31:48.647 [INFO][5057] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" HandleID="k8s-pod-network.a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Workload="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.648 [INFO][5046] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--765696dbb4--7zljh-eth0", GenerateName:"whisker-765696dbb4-", Namespace:"calico-system", SelfLink:"", UID:"a4162029-5497-4f8b-a7bf-68541ba5fac8", ResourceVersion:"1028", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 48, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"765696dbb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-765696dbb4-7zljh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3ebe50f790c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.649 [INFO][5046] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.649 [INFO][5046] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ebe50f790c ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.658 [INFO][5046] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 
00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.659 [INFO][5046] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--765696dbb4--7zljh-eth0", GenerateName:"whisker-765696dbb4-", Namespace:"calico-system", SelfLink:"", UID:"a4162029-5497-4f8b-a7bf-68541ba5fac8", ResourceVersion:"1028", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"765696dbb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0", Pod:"whisker-765696dbb4-7zljh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3ebe50f790c", MAC:"8a:15:13:09:b8:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:48.678146 containerd[1625]: 2025-10-28 00:31:48.674 [INFO][5046] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" Namespace="calico-system" Pod="whisker-765696dbb4-7zljh" WorkloadEndpoint="localhost-k8s-whisker--765696dbb4--7zljh-eth0" Oct 28 00:31:48.753642 containerd[1625]: time="2025-10-28T00:31:48.753606114Z" level=info msg="connecting to shim a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0" address="unix:///run/containerd/s/9f4a09ae0c855f4a4509dd64675aa1aa921e216c64086a0c01b4fda737a5d7bb" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:48.772660 systemd[1]: Started cri-containerd-a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0.scope - libcontainer container a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0. Oct 28 00:31:48.782181 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:48.811835 containerd[1625]: time="2025-10-28T00:31:48.811772363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-765696dbb4-7zljh,Uid:a4162029-5497-4f8b-a7bf-68541ba5fac8,Namespace:calico-system,Attempt:0,} returns sandbox id \"a39cbd67790b22dc90afce261128d66336529e01cc6a3c0702adcb2500bd1fd0\"" Oct 28 00:31:48.813186 containerd[1625]: time="2025-10-28T00:31:48.813169136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:31:49.134938 containerd[1625]: time="2025-10-28T00:31:49.134770851Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:49.135426 containerd[1625]: time="2025-10-28T00:31:49.135354224Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:31:49.135426 containerd[1625]: time="2025-10-28T00:31:49.135413977Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:31:49.135510 kubelet[2926]: E1028 00:31:49.135491 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:31:49.135732 kubelet[2926]: E1028 00:31:49.135531 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:31:49.135732 kubelet[2926]: E1028 00:31:49.135632 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:71fa32c1794749f18e2004eee7d3e458,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:49.137657 containerd[1625]: time="2025-10-28T00:31:49.137527156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 
00:31:49.503093 containerd[1625]: time="2025-10-28T00:31:49.502793439Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:49.503184 containerd[1625]: time="2025-10-28T00:31:49.503100136Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:31:49.503184 containerd[1625]: time="2025-10-28T00:31:49.503164515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:31:49.503307 kubelet[2926]: E1028 00:31:49.503279 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:31:49.503391 kubelet[2926]: E1028 00:31:49.503317 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:31:49.504287 kubelet[2926]: E1028 00:31:49.504253 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:49.518641 kubelet[2926]: E1028 00:31:49.517619 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:31:50.055290 systemd-networkd[1294]: cali3ebe50f790c: Gained IPv6LL Oct 28 00:31:50.129082 kubelet[2926]: E1028 00:31:50.129028 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:31:50.149081 kubelet[2926]: I1028 00:31:50.149053 2926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab512388-5b28-4703-8a4e-dbe34005b6c3" path="/var/lib/kubelet/pods/ab512388-5b28-4703-8a4e-dbe34005b6c3/volumes" Oct 28 00:31:52.148565 containerd[1625]: time="2025-10-28T00:31:52.148414942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,}" Oct 28 00:31:52.217754 systemd-networkd[1294]: cali02c0479dd11: Link UP Oct 28 00:31:52.218600 systemd-networkd[1294]: cali02c0479dd11: Gained carrier Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.172 [INFO][5131] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--p9vxm-eth0 csi-node-driver- calico-system bd5600ff-882d-4fd0-9a0a-4d9435b64027 724 0 2025-10-28 00:31:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-p9vxm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali02c0479dd11 [] [] }} ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.172 [INFO][5131] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.189 [INFO][5142] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" HandleID="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Workload="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.189 [INFO][5142] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" HandleID="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Workload="localhost-k8s-csi--node--driver--p9vxm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024ef50), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-p9vxm", "timestamp":"2025-10-28 00:31:52.189744019 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.189 [INFO][5142] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.189 [INFO][5142] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.189 [INFO][5142] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.194 [INFO][5142] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.197 [INFO][5142] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.200 [INFO][5142] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.201 [INFO][5142] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.203 [INFO][5142] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.203 [INFO][5142] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.205 [INFO][5142] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968 Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.207 [INFO][5142] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.210 [INFO][5142] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.210 [INFO][5142] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" host="localhost" Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.210 [INFO][5142] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:52.230294 containerd[1625]: 2025-10-28 00:31:52.210 [INFO][5142] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" HandleID="k8s-pod-network.9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Workload="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.212 [INFO][5131] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--p9vxm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bd5600ff-882d-4fd0-9a0a-4d9435b64027", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-p9vxm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali02c0479dd11", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.212 [INFO][5131] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.212 [INFO][5131] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali02c0479dd11 ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.219 [INFO][5131] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.219 [INFO][5131] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" 
Namespace="calico-system" Pod="csi-node-driver-p9vxm" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--p9vxm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bd5600ff-882d-4fd0-9a0a-4d9435b64027", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968", Pod:"csi-node-driver-p9vxm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali02c0479dd11", MAC:"f6:ac:17:cd:08:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:52.231322 containerd[1625]: 2025-10-28 00:31:52.227 [INFO][5131] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" Namespace="calico-system" Pod="csi-node-driver-p9vxm" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--p9vxm-eth0" Oct 28 00:31:52.250122 containerd[1625]: time="2025-10-28T00:31:52.249485145Z" level=info msg="connecting to shim 9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968" address="unix:///run/containerd/s/00ce6da2c10e85cfee4030687b2555939a10a59157587f55de97ec4db401f898" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:52.270691 systemd[1]: Started cri-containerd-9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968.scope - libcontainer container 9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968. Oct 28 00:31:52.279138 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:52.288898 containerd[1625]: time="2025-10-28T00:31:52.288866130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9vxm,Uid:bd5600ff-882d-4fd0-9a0a-4d9435b64027,Namespace:calico-system,Attempt:0,} returns sandbox id \"9cab15321a2caed1a779203f5d8f553fcabf1ec4f184e361c60ebcae382f6968\"" Oct 28 00:31:52.290050 containerd[1625]: time="2025-10-28T00:31:52.290033545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:31:52.633803 containerd[1625]: time="2025-10-28T00:31:52.633692136Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:52.634051 containerd[1625]: time="2025-10-28T00:31:52.634012166Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 28 00:31:52.634130 containerd[1625]: time="2025-10-28T00:31:52.634064377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:31:52.634175 kubelet[2926]: E1028 00:31:52.634157 2926 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:31:52.634425 kubelet[2926]: E1028 00:31:52.634187 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:31:52.634425 kubelet[2926]: E1028 00:31:52.634274 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,L
ifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:52.637279 containerd[1625]: time="2025-10-28T00:31:52.637248226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:31:52.974197 containerd[1625]: time="2025-10-28T00:31:52.974158939Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:52.975885 containerd[1625]: time="2025-10-28T00:31:52.975816848Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:31:52.975885 containerd[1625]: time="2025-10-28T00:31:52.975870795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:31:52.976020 kubelet[2926]: 
E1028 00:31:52.975985 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:31:52.976061 kubelet[2926]: E1028 00:31:52.976020 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:31:52.976151 kubelet[2926]: E1028 00:31:52.976123 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:52.977368 kubelet[2926]: E1028 00:31:52.977327 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:53.135870 kubelet[2926]: E1028 00:31:53.135831 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not 
found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:53.702795 systemd-networkd[1294]: cali02c0479dd11: Gained IPv6LL Oct 28 00:31:54.128195 containerd[1625]: time="2025-10-28T00:31:54.128124150Z" level=info msg="StopPodSandbox for \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\"" Oct 28 00:31:54.139060 kubelet[2926]: E1028 00:31:54.139036 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:31:54.149225 containerd[1625]: time="2025-10-28T00:31:54.149118705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.191 [WARNING][5212] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" 
WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.191 [INFO][5212] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.191 [INFO][5212] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" iface="eth0" netns="" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.191 [INFO][5212] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.191 [INFO][5212] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.213 [INFO][5231] ipam/ipam_plugin.go 436: Releasing address using handleID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.213 [INFO][5231] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.214 [INFO][5231] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.219 [WARNING][5231] ipam/ipam_plugin.go 453: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.219 [INFO][5231] ipam/ipam_plugin.go 464: Releasing address using workloadID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.221 [INFO][5231] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:54.224645 containerd[1625]: 2025-10-28 00:31:54.223 [INFO][5212] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.225377 containerd[1625]: time="2025-10-28T00:31:54.224673133Z" level=info msg="TearDown network for sandbox \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" successfully" Oct 28 00:31:54.225377 containerd[1625]: time="2025-10-28T00:31:54.224695302Z" level=info msg="StopPodSandbox for \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" returns successfully" Oct 28 00:31:54.225377 containerd[1625]: time="2025-10-28T00:31:54.225054845Z" level=info msg="RemovePodSandbox for \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\"" Oct 28 00:31:54.225377 containerd[1625]: time="2025-10-28T00:31:54.225073569Z" level=info msg="Forcibly stopping sandbox \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\"" Oct 28 00:31:54.263780 systemd-networkd[1294]: calid02c2b1d3bd: Link UP Oct 28 00:31:54.265655 systemd-networkd[1294]: calid02c2b1d3bd: Gained carrier Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.203 [INFO][5217] cni-plugin/plugin.go 340: Calico 
CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0 calico-apiserver-64d5cc589f- calico-apiserver 491a933d-8deb-47d2-a1f5-45928f657a21 838 0 2025-10-28 00:31:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64d5cc589f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64d5cc589f-jn72s eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid02c2b1d3bd [] [] }} ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.204 [INFO][5217] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.226 [INFO][5238] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" HandleID="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Workload="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.226 [INFO][5238] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" HandleID="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Workload="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5210), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64d5cc589f-jn72s", "timestamp":"2025-10-28 00:31:54.226123816 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.226 [INFO][5238] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.226 [INFO][5238] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.226 [INFO][5238] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.232 [INFO][5238] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.235 [INFO][5238] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.240 [INFO][5238] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.242 [INFO][5238] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.243 [INFO][5238] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.243 [INFO][5238] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.244 [INFO][5238] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.248 [INFO][5238] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.256 [INFO][5238] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 handle="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.256 [INFO][5238] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" host="localhost" Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.256 [INFO][5238] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:31:54.278903 containerd[1625]: 2025-10-28 00:31:54.256 [INFO][5238] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" HandleID="k8s-pod-network.7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Workload="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.259 [INFO][5217] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0", GenerateName:"calico-apiserver-64d5cc589f-", Namespace:"calico-apiserver", SelfLink:"", UID:"491a933d-8deb-47d2-a1f5-45928f657a21", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d5cc589f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64d5cc589f-jn72s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid02c2b1d3bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.259 [INFO][5217] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.259 [INFO][5217] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid02c2b1d3bd ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.266 [INFO][5217] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.266 [INFO][5217] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0", 
GenerateName:"calico-apiserver-64d5cc589f-", Namespace:"calico-apiserver", SelfLink:"", UID:"491a933d-8deb-47d2-a1f5-45928f657a21", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 31, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d5cc589f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d", Pod:"calico-apiserver-64d5cc589f-jn72s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid02c2b1d3bd", MAC:"7e:43:b8:0b:67:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:31:54.279375 containerd[1625]: 2025-10-28 00:31:54.272 [INFO][5217] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" Namespace="calico-apiserver" Pod="calico-apiserver-64d5cc589f-jn72s" WorkloadEndpoint="localhost-k8s-calico--apiserver--64d5cc589f--jn72s-eth0" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.260 [WARNING][5254] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" 
WorkloadEndpoint="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.260 [INFO][5254] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.260 [INFO][5254] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" iface="eth0" netns="" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.260 [INFO][5254] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.260 [INFO][5254] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.300 [INFO][5263] ipam/ipam_plugin.go 436: Releasing address using handleID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.300 [INFO][5263] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.300 [INFO][5263] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.305 [WARNING][5263] ipam/ipam_plugin.go 453: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.305 [INFO][5263] ipam/ipam_plugin.go 464: Releasing address using workloadID ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" HandleID="k8s-pod-network.d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Workload="localhost-k8s-whisker--78cfdf5758--2h92h-eth0" Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.306 [INFO][5263] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:31:54.308729 containerd[1625]: 2025-10-28 00:31:54.307 [INFO][5254] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790" Oct 28 00:31:54.309024 containerd[1625]: time="2025-10-28T00:31:54.308748051Z" level=info msg="TearDown network for sandbox \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" successfully" Oct 28 00:31:54.318384 containerd[1625]: time="2025-10-28T00:31:54.317664223Z" level=info msg="Ensure that sandbox d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790 in task-service has been cleanup successfully" Oct 28 00:31:54.324938 containerd[1625]: time="2025-10-28T00:31:54.324915023Z" level=info msg="RemovePodSandbox \"d6b4cf59538ea283e66128f0300a69564891c2f9c9945088d52317492c02b790\" returns successfully" Oct 28 00:31:54.325252 containerd[1625]: time="2025-10-28T00:31:54.325182222Z" level=info msg="connecting to shim 7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d" address="unix:///run/containerd/s/41835cc88c3164a7907b882f561a787a366dc43e8efe2fcee449748839901d21" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:31:54.347757 systemd[1]: Started 
cri-containerd-7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d.scope - libcontainer container 7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d. Oct 28 00:31:54.356602 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:31:54.387165 containerd[1625]: time="2025-10-28T00:31:54.387077379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d5cc589f-jn72s,Uid:491a933d-8deb-47d2-a1f5-45928f657a21,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7cd21adbe85a7e837d9623be3aeeb48c4660066695d1155aebec1ff26de4140d\"" Oct 28 00:31:54.388682 containerd[1625]: time="2025-10-28T00:31:54.388667789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:31:54.738939 containerd[1625]: time="2025-10-28T00:31:54.738899641Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:54.753702 containerd[1625]: time="2025-10-28T00:31:54.753643284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:31:54.753936 containerd[1625]: time="2025-10-28T00:31:54.753738360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:31:54.753972 kubelet[2926]: E1028 00:31:54.753872 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:54.754172 
kubelet[2926]: E1028 00:31:54.754045 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:54.754287 kubelet[2926]: E1028 00:31:54.754244 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmbdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:54.755533 kubelet[2926]: E1028 00:31:54.755470 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:31:55.140962 kubelet[2926]: E1028 00:31:55.140703 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:31:55.942725 systemd-networkd[1294]: calid02c2b1d3bd: Gained IPv6LL Oct 28 00:31:56.143111 kubelet[2926]: E1028 00:31:56.142877 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:31:59.149178 containerd[1625]: time="2025-10-28T00:31:59.148900198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:31:59.462895 containerd[1625]: time="2025-10-28T00:31:59.462791553Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:31:59.468350 containerd[1625]: time="2025-10-28T00:31:59.468300594Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:31:59.468467 containerd[1625]: time="2025-10-28T00:31:59.468388838Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:31:59.468553 kubelet[2926]: E1028 00:31:59.468519 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:59.468875 kubelet[2926]: E1028 00:31:59.468561 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:31:59.468875 kubelet[2926]: E1028 00:31:59.468688 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjp7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:31:59.470712 kubelet[2926]: E1028 00:31:59.470684 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:32:00.149440 containerd[1625]: time="2025-10-28T00:32:00.149213359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:32:00.468954 containerd[1625]: time="2025-10-28T00:32:00.468751978Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:00.469246 containerd[1625]: time="2025-10-28T00:32:00.469216993Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:32:00.469324 containerd[1625]: time="2025-10-28T00:32:00.469300756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:00.469508 kubelet[2926]: E1028 00:32:00.469463 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:00.470516 kubelet[2926]: E1028 00:32:00.469512 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:00.470516 kubelet[2926]: E1028 00:32:00.469823 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9j5lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:00.471504 kubelet[2926]: E1028 00:32:00.471365 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:32:00.476176 containerd[1625]: time="2025-10-28T00:32:00.476098240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:32:00.804685 containerd[1625]: 
time="2025-10-28T00:32:00.804526743Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:00.834608 containerd[1625]: time="2025-10-28T00:32:00.834541602Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:00.834774 containerd[1625]: time="2025-10-28T00:32:00.834564084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:32:00.834811 kubelet[2926]: E1028 00:32:00.834775 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:32:00.834859 kubelet[2926]: E1028 00:32:00.834815 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:32:00.834988 kubelet[2926]: E1028 00:32:00.834933 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6blqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:00.836260 kubelet[2926]: E1028 00:32:00.836215 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:32:01.148198 containerd[1625]: time="2025-10-28T00:32:01.148085969Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:32:01.511779 containerd[1625]: time="2025-10-28T00:32:01.511736371Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Oct 28 00:32:01.512246 containerd[1625]: time="2025-10-28T00:32:01.512226488Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:32:01.512330 containerd[1625]: time="2025-10-28T00:32:01.512308028Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:32:01.512510 kubelet[2926]: E1028 00:32:01.512483 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:32:01.512701 kubelet[2926]: E1028 00:32:01.512521 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:32:01.512701 kubelet[2926]: E1028 00:32:01.512624 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjdbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6f4bdd4695-jlv4b_calico-system(96642e2b-a099-4486-a4d3-2cc6f34eac9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:01.514023 kubelet[2926]: E1028 00:32:01.513997 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:32:05.149300 containerd[1625]: time="2025-10-28T00:32:05.149040524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:32:05.532927 containerd[1625]: 
time="2025-10-28T00:32:05.532759178Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:05.533211 containerd[1625]: time="2025-10-28T00:32:05.533193184Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:32:05.533295 containerd[1625]: time="2025-10-28T00:32:05.533259361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:32:05.533395 kubelet[2926]: E1028 00:32:05.533366 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:32:05.533781 kubelet[2926]: E1028 00:32:05.533403 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:32:05.533781 kubelet[2926]: E1028 00:32:05.533476 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:71fa32c1794749f18e2004eee7d3e458,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:05.535304 containerd[1625]: time="2025-10-28T00:32:05.535284889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 
00:32:05.908001 containerd[1625]: time="2025-10-28T00:32:05.907913882Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:05.917109 containerd[1625]: time="2025-10-28T00:32:05.917076851Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:32:05.917186 containerd[1625]: time="2025-10-28T00:32:05.917138575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:32:05.917245 kubelet[2926]: E1028 00:32:05.917216 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:32:05.917290 kubelet[2926]: E1028 00:32:05.917253 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:32:05.917430 kubelet[2926]: E1028 00:32:05.917332 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:05.918651 kubelet[2926]: E1028 00:32:05.918618 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:32:07.149522 containerd[1625]: time="2025-10-28T00:32:07.148860959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:32:07.611740 containerd[1625]: time="2025-10-28T00:32:07.611703886Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:07.617106 containerd[1625]: time="2025-10-28T00:32:07.617084146Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:32:07.617194 containerd[1625]: time="2025-10-28T00:32:07.617127578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:07.617235 
kubelet[2926]: E1028 00:32:07.617210 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:07.617437 kubelet[2926]: E1028 00:32:07.617240 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:07.617437 kubelet[2926]: E1028 00:32:07.617323 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmbdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:07.618843 kubelet[2926]: E1028 00:32:07.618823 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:32:08.147952 containerd[1625]: time="2025-10-28T00:32:08.147659821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:32:08.482649 containerd[1625]: time="2025-10-28T00:32:08.482558497Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:08.483063 containerd[1625]: time="2025-10-28T00:32:08.482919579Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 28 00:32:08.483063 containerd[1625]: time="2025-10-28T00:32:08.482974965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:32:08.483165 kubelet[2926]: E1028 00:32:08.483111 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:32:08.483209 kubelet[2926]: E1028 00:32:08.483174 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:32:08.484029 kubelet[2926]: E1028 00:32:08.483310 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivileg
eEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:08.484983 containerd[1625]: time="2025-10-28T00:32:08.484965840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:32:08.979593 containerd[1625]: time="2025-10-28T00:32:08.979535520Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:08.979937 containerd[1625]: time="2025-10-28T00:32:08.979912338Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:32:08.980010 containerd[1625]: time="2025-10-28T00:32:08.979925208Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:32:08.980126 kubelet[2926]: E1028 00:32:08.980090 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:32:08.980408 kubelet[2926]: E1028 00:32:08.980132 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:32:08.980408 kubelet[2926]: E1028 00:32:08.980221 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:08.981673 kubelet[2926]: E1028 00:32:08.981643 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:32:09.355915 containerd[1625]: time="2025-10-28T00:32:09.355842460Z" level=info 
msg="TaskExit event in podsandbox handler container_id:\"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" id:\"f91cd2cfee866cbcd1b71dae08246a8b488ab9faebe7e8c6bd18c63d816dc92a\" pid:5359 exited_at:{seconds:1761611529 nanos:355530323}" Oct 28 00:32:12.149835 kubelet[2926]: E1028 00:32:12.149772 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:32:13.148634 kubelet[2926]: E1028 00:32:13.148592 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:32:13.148634 kubelet[2926]: E1028 00:32:13.148604 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:32:14.148477 kubelet[2926]: E1028 00:32:14.148435 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:32:18.149705 kubelet[2926]: E1028 00:32:18.149671 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:32:20.151675 kubelet[2926]: E1028 00:32:20.150920 2926 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:32:22.149272 kubelet[2926]: E1028 00:32:22.149221 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:32:23.148236 containerd[1625]: time="2025-10-28T00:32:23.148189623Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:32:23.622258 containerd[1625]: time="2025-10-28T00:32:23.622102549Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:23.630491 containerd[1625]: time="2025-10-28T00:32:23.630471966Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:32:23.630621 containerd[1625]: time="2025-10-28T00:32:23.630524916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:23.630755 kubelet[2926]: E1028 00:32:23.630727 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:23.630985 kubelet[2926]: E1028 00:32:23.630761 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:23.630985 kubelet[2926]: E1028 00:32:23.630852 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9j5lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:23.632014 kubelet[2926]: E1028 00:32:23.631971 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:32:24.183709 systemd[1]: Started sshd@8-139.178.70.100:22-139.178.89.65:35356.service - OpenSSH per-connection server daemon (139.178.89.65:35356). Oct 28 00:32:24.249462 sshd[5381]: Accepted publickey for core from 139.178.89.65 port 35356 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:24.251045 sshd-session[5381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:24.256880 systemd-logind[1597]: New session 10 of user core. Oct 28 00:32:24.261738 systemd[1]: Started session-10.scope - Session 10 of User core. Oct 28 00:32:24.828964 sshd[5384]: Connection closed by 139.178.89.65 port 35356 Oct 28 00:32:24.829433 sshd-session[5381]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:24.892296 systemd-logind[1597]: Session 10 logged out. Waiting for processes to exit. Oct 28 00:32:24.892320 systemd[1]: sshd@8-139.178.70.100:22-139.178.89.65:35356.service: Deactivated successfully. Oct 28 00:32:24.894667 systemd[1]: session-10.scope: Deactivated successfully. Oct 28 00:32:24.896689 systemd-logind[1597]: Removed session 10. 
Oct 28 00:32:26.148811 containerd[1625]: time="2025-10-28T00:32:26.148733678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:32:26.485082 containerd[1625]: time="2025-10-28T00:32:26.485045266Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:26.485370 containerd[1625]: time="2025-10-28T00:32:26.485348098Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:32:26.485412 containerd[1625]: time="2025-10-28T00:32:26.485400515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:26.485545 kubelet[2926]: E1028 00:32:26.485524 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:32:26.485920 kubelet[2926]: E1028 00:32:26.485775 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:32:26.485920 kubelet[2926]: E1028 00:32:26.485873 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6blqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:26.487282 kubelet[2926]: E1028 00:32:26.487257 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:32:28.149030 containerd[1625]: time="2025-10-28T00:32:28.148933168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:32:28.558134 containerd[1625]: time="2025-10-28T00:32:28.558020935Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Oct 28 00:32:28.558388 containerd[1625]: time="2025-10-28T00:32:28.558348606Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:32:28.558428 containerd[1625]: time="2025-10-28T00:32:28.558416978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:32:28.558607 kubelet[2926]: E1028 00:32:28.558532 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:32:28.558607 kubelet[2926]: E1028 00:32:28.558587 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:32:28.559794 kubelet[2926]: E1028 00:32:28.559123 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjdbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6f4bdd4695-jlv4b_calico-system(96642e2b-a099-4486-a4d3-2cc6f34eac9f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:28.559877 containerd[1625]: time="2025-10-28T00:32:28.558796843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:32:28.560615 kubelet[2926]: E1028 00:32:28.560525 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:32:28.881336 containerd[1625]: 
time="2025-10-28T00:32:28.881245737Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:28.881992 containerd[1625]: time="2025-10-28T00:32:28.881962307Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:32:28.882086 containerd[1625]: time="2025-10-28T00:32:28.882070025Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:28.882617 kubelet[2926]: E1028 00:32:28.882550 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:28.882665 kubelet[2926]: E1028 00:32:28.882628 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:28.883145 kubelet[2926]: E1028 00:32:28.882896 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjp7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:28.884592 kubelet[2926]: E1028 00:32:28.884457 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:32:29.840218 systemd[1]: Started sshd@9-139.178.70.100:22-139.178.89.65:35024.service - OpenSSH per-connection server daemon (139.178.89.65:35024). Oct 28 00:32:29.983809 sshd[5405]: Accepted publickey for core from 139.178.89.65 port 35024 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:29.984843 sshd-session[5405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:29.990444 systemd-logind[1597]: New session 11 of user core. Oct 28 00:32:30.001972 systemd[1]: Started session-11.scope - Session 11 of User core. Oct 28 00:32:30.171288 sshd[5408]: Connection closed by 139.178.89.65 port 35024 Oct 28 00:32:30.171712 sshd-session[5405]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:30.176531 systemd[1]: sshd@9-139.178.70.100:22-139.178.89.65:35024.service: Deactivated successfully. Oct 28 00:32:30.178029 systemd[1]: session-11.scope: Deactivated successfully. Oct 28 00:32:30.179064 systemd-logind[1597]: Session 11 logged out. Waiting for processes to exit. Oct 28 00:32:30.180179 systemd-logind[1597]: Removed session 11. 
Oct 28 00:32:31.148697 containerd[1625]: time="2025-10-28T00:32:31.148668476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:32:31.496114 containerd[1625]: time="2025-10-28T00:32:31.496075663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:31.500976 containerd[1625]: time="2025-10-28T00:32:31.500902663Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:32:31.500976 containerd[1625]: time="2025-10-28T00:32:31.500945517Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:32:31.501192 kubelet[2926]: E1028 00:32:31.501078 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:32:31.501192 kubelet[2926]: E1028 00:32:31.501121 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:32:31.501562 kubelet[2926]: E1028 00:32:31.501221 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:71fa32c1794749f18e2004eee7d3e458,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:31.504041 containerd[1625]: time="2025-10-28T00:32:31.504020607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 
00:32:31.887799 containerd[1625]: time="2025-10-28T00:32:31.885116940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:31.890815 containerd[1625]: time="2025-10-28T00:32:31.890615878Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:32:31.890815 containerd[1625]: time="2025-10-28T00:32:31.890673268Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:32:31.890952 kubelet[2926]: E1028 00:32:31.890913 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:32:31.890996 kubelet[2926]: E1028 00:32:31.890957 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:32:31.891379 kubelet[2926]: E1028 00:32:31.891062 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-765696dbb4-7zljh_calico-system(a4162029-5497-4f8b-a7bf-68541ba5fac8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:31.892556 kubelet[2926]: E1028 00:32:31.892504 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:32:34.149721 containerd[1625]: time="2025-10-28T00:32:34.149677061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:32:34.462589 containerd[1625]: time="2025-10-28T00:32:34.462544779Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:34.463037 containerd[1625]: time="2025-10-28T00:32:34.462982365Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:32:34.463071 containerd[1625]: time="2025-10-28T00:32:34.463041837Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:32:34.463227 
kubelet[2926]: E1028 00:32:34.463194 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:34.463467 kubelet[2926]: E1028 00:32:34.463234 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:32:34.463467 kubelet[2926]: E1028 00:32:34.463384 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmbdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-jn72s_calico-apiserver(491a933d-8deb-47d2-a1f5-45928f657a21): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:34.463987 containerd[1625]: time="2025-10-28T00:32:34.463972049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:32:34.465016 kubelet[2926]: E1028 00:32:34.465001 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:32:34.759159 containerd[1625]: time="2025-10-28T00:32:34.758920455Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:34.759618 containerd[1625]: time="2025-10-28T00:32:34.759595037Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 28 00:32:34.759731 containerd[1625]: time="2025-10-28T00:32:34.759651279Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:32:34.759857 kubelet[2926]: E1028 00:32:34.759824 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:32:34.759913 kubelet[2926]: E1028 00:32:34.759859 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:32:34.760187 kubelet[2926]: E1028 00:32:34.759946 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivileg
eEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:34.762472 containerd[1625]: time="2025-10-28T00:32:34.762330904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:32:35.098123 containerd[1625]: time="2025-10-28T00:32:35.098023134Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:32:35.099542 containerd[1625]: time="2025-10-28T00:32:35.099514728Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:32:35.099674 containerd[1625]: time="2025-10-28T00:32:35.099568954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:32:35.099719 kubelet[2926]: E1028 00:32:35.099681 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:32:35.099777 kubelet[2926]: E1028 00:32:35.099729 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:32:35.099937 kubelet[2926]: E1028 00:32:35.099826 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-p9vxm_calico-system(bd5600ff-882d-4fd0-9a0a-4d9435b64027): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:32:35.101148 kubelet[2926]: E1028 00:32:35.101119 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027" Oct 28 00:32:35.184433 systemd[1]: Started 
sshd@10-139.178.70.100:22-139.178.89.65:35036.service - OpenSSH per-connection server daemon (139.178.89.65:35036). Oct 28 00:32:35.253031 sshd[5426]: Accepted publickey for core from 139.178.89.65 port 35036 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:35.254112 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:35.259759 systemd-logind[1597]: New session 12 of user core. Oct 28 00:32:35.266762 systemd[1]: Started session-12.scope - Session 12 of User core. Oct 28 00:32:35.395908 sshd[5429]: Connection closed by 139.178.89.65 port 35036 Oct 28 00:32:35.399427 sshd-session[5426]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:35.404732 systemd[1]: Started sshd@11-139.178.70.100:22-139.178.89.65:35048.service - OpenSSH per-connection server daemon (139.178.89.65:35048). Oct 28 00:32:35.406253 systemd[1]: sshd@10-139.178.70.100:22-139.178.89.65:35036.service: Deactivated successfully. Oct 28 00:32:35.409011 systemd[1]: session-12.scope: Deactivated successfully. Oct 28 00:32:35.411500 systemd-logind[1597]: Session 12 logged out. Waiting for processes to exit. Oct 28 00:32:35.412656 systemd-logind[1597]: Removed session 12. Oct 28 00:32:35.503590 sshd[5438]: Accepted publickey for core from 139.178.89.65 port 35048 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:35.506309 sshd-session[5438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:35.513607 systemd-logind[1597]: New session 13 of user core. Oct 28 00:32:35.521001 systemd[1]: Started session-13.scope - Session 13 of User core. 
Oct 28 00:32:36.075459 sshd[5444]: Connection closed by 139.178.89.65 port 35048 Oct 28 00:32:36.084367 sshd-session[5438]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:36.087696 systemd[1]: Started sshd@12-139.178.70.100:22-139.178.89.65:35776.service - OpenSSH per-connection server daemon (139.178.89.65:35776). Oct 28 00:32:36.088036 systemd[1]: sshd@11-139.178.70.100:22-139.178.89.65:35048.service: Deactivated successfully. Oct 28 00:32:36.089722 systemd[1]: session-13.scope: Deactivated successfully. Oct 28 00:32:36.090504 systemd-logind[1597]: Session 13 logged out. Waiting for processes to exit. Oct 28 00:32:36.092001 systemd-logind[1597]: Removed session 13. Oct 28 00:32:36.296283 sshd[5451]: Accepted publickey for core from 139.178.89.65 port 35776 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:36.296185 sshd-session[5451]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:36.303632 systemd-logind[1597]: New session 14 of user core. Oct 28 00:32:36.308706 systemd[1]: Started session-14.scope - Session 14 of User core. Oct 28 00:32:36.453269 sshd[5457]: Connection closed by 139.178.89.65 port 35776 Oct 28 00:32:36.453198 sshd-session[5451]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:36.457370 systemd-logind[1597]: Session 14 logged out. Waiting for processes to exit. Oct 28 00:32:36.457483 systemd[1]: sshd@12-139.178.70.100:22-139.178.89.65:35776.service: Deactivated successfully. Oct 28 00:32:36.458892 systemd[1]: session-14.scope: Deactivated successfully. Oct 28 00:32:36.460107 systemd-logind[1597]: Removed session 14. 
Oct 28 00:32:37.148523 kubelet[2926]: E1028 00:32:37.148489 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f" Oct 28 00:32:38.148930 kubelet[2926]: E1028 00:32:38.148889 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed" Oct 28 00:32:39.516923 containerd[1625]: time="2025-10-28T00:32:39.516872535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" id:\"00ee1dcd96de880651cd7ed48618634af0ed4eb6d31b31c5a43a589f5a9ee8c5\" pid:5482 exit_status:1 exited_at:{seconds:1761611559 nanos:485322538}" Oct 28 00:32:41.463727 systemd[1]: Started sshd@13-139.178.70.100:22-139.178.89.65:35786.service - OpenSSH per-connection server daemon (139.178.89.65:35786). 
Oct 28 00:32:41.650888 sshd[5497]: Accepted publickey for core from 139.178.89.65 port 35786 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:41.651907 sshd-session[5497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:41.655633 systemd-logind[1597]: New session 15 of user core. Oct 28 00:32:41.661912 systemd[1]: Started session-15.scope - Session 15 of User core. Oct 28 00:32:41.906765 sshd[5500]: Connection closed by 139.178.89.65 port 35786 Oct 28 00:32:41.907372 sshd-session[5497]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:41.913788 systemd[1]: sshd@13-139.178.70.100:22-139.178.89.65:35786.service: Deactivated successfully. Oct 28 00:32:41.916338 systemd[1]: session-15.scope: Deactivated successfully. Oct 28 00:32:41.918034 systemd-logind[1597]: Session 15 logged out. Waiting for processes to exit. Oct 28 00:32:41.919331 systemd-logind[1597]: Removed session 15. Oct 28 00:32:42.161494 kubelet[2926]: E1028 00:32:42.161415 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece" Oct 28 00:32:43.147618 kubelet[2926]: E1028 00:32:43.147555 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f" Oct 28 00:32:45.151832 kubelet[2926]: E1028 00:32:45.150954 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8" Oct 28 00:32:46.916717 systemd[1]: Started sshd@14-139.178.70.100:22-139.178.89.65:47912.service - OpenSSH per-connection server daemon (139.178.89.65:47912). Oct 28 00:32:47.036588 sshd[5512]: Accepted publickey for core from 139.178.89.65 port 47912 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg Oct 28 00:32:47.039294 sshd-session[5512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:32:47.043875 systemd-logind[1597]: New session 16 of user core. Oct 28 00:32:47.047678 systemd[1]: Started session-16.scope - Session 16 of User core. 
Oct 28 00:32:47.166137 kubelet[2926]: E1028 00:32:47.166087 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21" Oct 28 00:32:47.175481 sshd[5515]: Connection closed by 139.178.89.65 port 47912 Oct 28 00:32:47.177315 sshd-session[5512]: pam_unix(sshd:session): session closed for user core Oct 28 00:32:47.182014 systemd-logind[1597]: Session 16 logged out. Waiting for processes to exit. Oct 28 00:32:47.182198 systemd[1]: sshd@14-139.178.70.100:22-139.178.89.65:47912.service: Deactivated successfully. Oct 28 00:32:47.183830 systemd[1]: session-16.scope: Deactivated successfully. Oct 28 00:32:47.185457 systemd-logind[1597]: Removed session 16. 
Oct 28 00:32:50.150871 kubelet[2926]: E1028 00:32:50.150813 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027"
Oct 28 00:32:52.148322 kubelet[2926]: E1028 00:32:52.147973 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f"
Oct 28 00:32:52.185807 systemd[1]: Started sshd@15-139.178.70.100:22-139.178.89.65:47928.service - OpenSSH per-connection server daemon (139.178.89.65:47928).
Oct 28 00:32:52.237468 sshd[5527]: Accepted publickey for core from 139.178.89.65 port 47928 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:52.238387 sshd-session[5527]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:52.241860 systemd-logind[1597]: New session 17 of user core.
Oct 28 00:32:52.243688 systemd[1]: Started session-17.scope - Session 17 of User core.
Oct 28 00:32:52.380752 sshd[5530]: Connection closed by 139.178.89.65 port 47928
Oct 28 00:32:52.381087 sshd-session[5527]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:52.383530 systemd[1]: sshd@15-139.178.70.100:22-139.178.89.65:47928.service: Deactivated successfully.
Oct 28 00:32:52.384753 systemd[1]: session-17.scope: Deactivated successfully.
Oct 28 00:32:52.385554 systemd-logind[1597]: Session 17 logged out. Waiting for processes to exit.
Oct 28 00:32:52.386539 systemd-logind[1597]: Removed session 17.
Oct 28 00:32:53.148925 kubelet[2926]: E1028 00:32:53.148900 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed"
Oct 28 00:32:55.148018 kubelet[2926]: E1028 00:32:55.147807 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f"
Oct 28 00:32:56.152289 kubelet[2926]: E1028 00:32:56.152251 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece"
Oct 28 00:32:57.391799 systemd[1]: Started sshd@16-139.178.70.100:22-139.178.89.65:52500.service - OpenSSH per-connection server daemon (139.178.89.65:52500).
Oct 28 00:32:57.436059 sshd[5546]: Accepted publickey for core from 139.178.89.65 port 52500 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:57.437079 sshd-session[5546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:57.440811 systemd-logind[1597]: New session 18 of user core.
Oct 28 00:32:57.447723 systemd[1]: Started session-18.scope - Session 18 of User core.
Oct 28 00:32:57.569716 sshd[5549]: Connection closed by 139.178.89.65 port 52500
Oct 28 00:32:57.570076 sshd-session[5546]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:57.579810 systemd[1]: sshd@16-139.178.70.100:22-139.178.89.65:52500.service: Deactivated successfully.
Oct 28 00:32:57.583009 systemd[1]: session-18.scope: Deactivated successfully.
Oct 28 00:32:57.584985 systemd-logind[1597]: Session 18 logged out. Waiting for processes to exit.
Oct 28 00:32:57.588765 systemd[1]: Started sshd@17-139.178.70.100:22-139.178.89.65:52502.service - OpenSSH per-connection server daemon (139.178.89.65:52502).
Oct 28 00:32:57.589233 systemd-logind[1597]: Removed session 18.
Oct 28 00:32:57.638887 sshd[5560]: Accepted publickey for core from 139.178.89.65 port 52502 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:57.639954 sshd-session[5560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:57.646245 systemd-logind[1597]: New session 19 of user core.
Oct 28 00:32:57.649716 systemd[1]: Started session-19.scope - Session 19 of User core.
Oct 28 00:32:58.016134 sshd[5563]: Connection closed by 139.178.89.65 port 52502
Oct 28 00:32:58.017810 sshd-session[5560]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:58.023240 systemd[1]: sshd@17-139.178.70.100:22-139.178.89.65:52502.service: Deactivated successfully.
Oct 28 00:32:58.025544 systemd[1]: session-19.scope: Deactivated successfully.
Oct 28 00:32:58.026912 systemd-logind[1597]: Session 19 logged out. Waiting for processes to exit.
Oct 28 00:32:58.032623 systemd[1]: Started sshd@18-139.178.70.100:22-139.178.89.65:52512.service - OpenSSH per-connection server daemon (139.178.89.65:52512).
Oct 28 00:32:58.033619 systemd-logind[1597]: Removed session 19.
Oct 28 00:32:58.131974 sshd[5573]: Accepted publickey for core from 139.178.89.65 port 52512 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:58.133242 sshd-session[5573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:58.138882 systemd-logind[1597]: New session 20 of user core.
Oct 28 00:32:58.145741 systemd[1]: Started session-20.scope - Session 20 of User core.
Oct 28 00:32:58.765423 sshd[5576]: Connection closed by 139.178.89.65 port 52512
Oct 28 00:32:58.767720 sshd-session[5573]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:58.776921 systemd[1]: Started sshd@19-139.178.70.100:22-139.178.89.65:52516.service - OpenSSH per-connection server daemon (139.178.89.65:52516).
Oct 28 00:32:58.777213 systemd[1]: sshd@18-139.178.70.100:22-139.178.89.65:52512.service: Deactivated successfully.
Oct 28 00:32:58.784795 systemd[1]: session-20.scope: Deactivated successfully.
Oct 28 00:32:58.787248 systemd-logind[1597]: Session 20 logged out. Waiting for processes to exit.
Oct 28 00:32:58.789881 systemd-logind[1597]: Removed session 20.
Oct 28 00:32:58.837877 sshd[5587]: Accepted publickey for core from 139.178.89.65 port 52516 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:58.838938 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:58.844685 systemd-logind[1597]: New session 21 of user core.
Oct 28 00:32:58.847675 systemd[1]: Started session-21.scope - Session 21 of User core.
Oct 28 00:32:59.150833 kubelet[2926]: E1028 00:32:59.150109 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21"
Oct 28 00:32:59.152282 kubelet[2926]: E1028 00:32:59.151119 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-765696dbb4-7zljh" podUID="a4162029-5497-4f8b-a7bf-68541ba5fac8"
Oct 28 00:32:59.407740 sshd[5595]: Connection closed by 139.178.89.65 port 52516
Oct 28 00:32:59.407642 sshd-session[5587]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:59.416394 systemd[1]: sshd@19-139.178.70.100:22-139.178.89.65:52516.service: Deactivated successfully.
Oct 28 00:32:59.418641 systemd[1]: session-21.scope: Deactivated successfully.
Oct 28 00:32:59.419316 systemd-logind[1597]: Session 21 logged out. Waiting for processes to exit.
Oct 28 00:32:59.421365 systemd[1]: Started sshd@20-139.178.70.100:22-139.178.89.65:52528.service - OpenSSH per-connection server daemon (139.178.89.65:52528).
Oct 28 00:32:59.425519 systemd-logind[1597]: Removed session 21.
Oct 28 00:32:59.477455 sshd[5607]: Accepted publickey for core from 139.178.89.65 port 52528 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:32:59.478383 sshd-session[5607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:32:59.481086 systemd-logind[1597]: New session 22 of user core.
Oct 28 00:32:59.489714 systemd[1]: Started session-22.scope - Session 22 of User core.
Oct 28 00:32:59.601258 sshd[5610]: Connection closed by 139.178.89.65 port 52528
Oct 28 00:32:59.602083 sshd-session[5607]: pam_unix(sshd:session): session closed for user core
Oct 28 00:32:59.604348 systemd[1]: sshd@20-139.178.70.100:22-139.178.89.65:52528.service: Deactivated successfully.
Oct 28 00:32:59.606820 systemd[1]: session-22.scope: Deactivated successfully.
Oct 28 00:32:59.608608 systemd-logind[1597]: Session 22 logged out. Waiting for processes to exit.
Oct 28 00:32:59.609651 systemd-logind[1597]: Removed session 22.
Oct 28 00:33:02.149835 kubelet[2926]: E1028 00:33:02.149530 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p9vxm" podUID="bd5600ff-882d-4fd0-9a0a-4d9435b64027"
Oct 28 00:33:04.612004 systemd[1]: Started sshd@21-139.178.70.100:22-139.178.89.65:52530.service - OpenSSH per-connection server daemon (139.178.89.65:52530).
Oct 28 00:33:04.673683 sshd[5625]: Accepted publickey for core from 139.178.89.65 port 52530 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:33:04.674323 sshd-session[5625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:33:04.677722 systemd-logind[1597]: New session 23 of user core.
Oct 28 00:33:04.681687 systemd[1]: Started session-23.scope - Session 23 of User core.
Oct 28 00:33:04.802997 sshd[5628]: Connection closed by 139.178.89.65 port 52530
Oct 28 00:33:04.803479 sshd-session[5625]: pam_unix(sshd:session): session closed for user core
Oct 28 00:33:04.807070 systemd-logind[1597]: Session 23 logged out. Waiting for processes to exit.
Oct 28 00:33:04.807139 systemd[1]: sshd@21-139.178.70.100:22-139.178.89.65:52530.service: Deactivated successfully.
Oct 28 00:33:04.808799 systemd[1]: session-23.scope: Deactivated successfully.
Oct 28 00:33:04.812636 systemd-logind[1597]: Removed session 23.
Oct 28 00:33:06.149363 kubelet[2926]: E1028 00:33:06.149329 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f4bdd4695-jlv4b" podUID="96642e2b-a099-4486-a4d3-2cc6f34eac9f"
Oct 28 00:33:07.149397 containerd[1625]: time="2025-10-28T00:33:07.149360913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Oct 28 00:33:07.525436 containerd[1625]: time="2025-10-28T00:33:07.521521867Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Oct 28 00:33:07.525782 containerd[1625]: time="2025-10-28T00:33:07.525751244Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Oct 28 00:33:07.526039 containerd[1625]: time="2025-10-28T00:33:07.525812375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Oct 28 00:33:07.526934 kubelet[2926]: E1028 00:33:07.526210 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Oct 28 00:33:07.526934 kubelet[2926]: E1028 00:33:07.526255 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Oct 28 00:33:07.526934 kubelet[2926]: E1028 00:33:07.526368 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6blqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-x6j9p_calico-system(a3d67cc9-a4af-4a25-892e-b5ffc390b89f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Oct 28 00:33:07.527670 kubelet[2926]: E1028 00:33:07.527514 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-x6j9p" podUID="a3d67cc9-a4af-4a25-892e-b5ffc390b89f"
Oct 28 00:33:08.149293 containerd[1625]: time="2025-10-28T00:33:08.149265194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Oct 28 00:33:08.522818 containerd[1625]: time="2025-10-28T00:33:08.522779901Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Oct 28 00:33:08.523082 containerd[1625]: time="2025-10-28T00:33:08.523067355Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Oct 28 00:33:08.523182 containerd[1625]: time="2025-10-28T00:33:08.523120320Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Oct 28 00:33:08.523226 kubelet[2926]: E1028 00:33:08.523203 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Oct 28 00:33:08.523266 kubelet[2926]: E1028 00:33:08.523239 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Oct 28 00:33:08.523585 kubelet[2926]: E1028 00:33:08.523356 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9j5lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64fb65988f-xxdqr_calico-apiserver(5296fefe-d676-448c-ac61-6435527489ed): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Oct 28 00:33:08.525252 kubelet[2926]: E1028 00:33:08.525235 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64fb65988f-xxdqr" podUID="5296fefe-d676-448c-ac61-6435527489ed"
Oct 28 00:33:09.505757 containerd[1625]: time="2025-10-28T00:33:09.505729281Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8c5da9c73044d22354101c19d26db3e46c934b6d630f8a890d5eb6ddca1cec6f\" id:\"c396d5bb8f4e5bf657ec8b702f951b0fc321f322b1a73d36831288bf9b473859\" pid:5657 exited_at:{seconds:1761611589 nanos:505495900}"
Oct 28 00:33:09.812238 systemd[1]: Started sshd@22-139.178.70.100:22-139.178.89.65:47002.service - OpenSSH per-connection server daemon (139.178.89.65:47002).
Oct 28 00:33:09.940147 sshd[5669]: Accepted publickey for core from 139.178.89.65 port 47002 ssh2: RSA SHA256:fNZ76zf85LC4oUO7gzgKfc2yaB17DBEcT6LZSeIfEpg
Oct 28 00:33:09.942462 sshd-session[5669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 28 00:33:09.947443 systemd-logind[1597]: New session 24 of user core.
Oct 28 00:33:09.952732 systemd[1]: Started session-24.scope - Session 24 of User core.
Oct 28 00:33:10.141039 sshd[5672]: Connection closed by 139.178.89.65 port 47002
Oct 28 00:33:10.141870 sshd-session[5669]: pam_unix(sshd:session): session closed for user core
Oct 28 00:33:10.143778 systemd[1]: sshd@22-139.178.70.100:22-139.178.89.65:47002.service: Deactivated successfully.
Oct 28 00:33:10.145013 systemd[1]: session-24.scope: Deactivated successfully.
Oct 28 00:33:10.146196 systemd-logind[1597]: Session 24 logged out. Waiting for processes to exit.
Oct 28 00:33:10.146845 systemd-logind[1597]: Removed session 24.
Oct 28 00:33:11.148178 containerd[1625]: time="2025-10-28T00:33:11.147998469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Oct 28 00:33:11.490412 containerd[1625]: time="2025-10-28T00:33:11.489783248Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Oct 28 00:33:11.490641 containerd[1625]: time="2025-10-28T00:33:11.490624823Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Oct 28 00:33:11.490734 containerd[1625]: time="2025-10-28T00:33:11.490683140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Oct 28 00:33:11.491257 kubelet[2926]: E1028 00:33:11.491231 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Oct 28 00:33:11.491775 kubelet[2926]: E1028 00:33:11.491470 2926 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Oct 28 00:33:11.491855 kubelet[2926]: E1028 00:33:11.491828 2926 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjp7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-64d5cc589f-kfc7q_calico-apiserver(2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Oct 28 00:33:11.493191 kubelet[2926]: E1028 00:33:11.493165 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-kfc7q" podUID="2c04f8c1-bc0d-4ee4-8a41-3a74344e8ece"
Oct 28 00:33:12.148233 kubelet[2926]: E1028 00:33:12.148101 2926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-64d5cc589f-jn72s" podUID="491a933d-8deb-47d2-a1f5-45928f657a21"