Oct 28 00:12:40.947999 kernel: Linux version 6.12.54-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Mon Oct 27 22:07:42 -00 2025 Oct 28 00:12:40.948016 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=bb8cbc137ff563234eef33bdd51a5c9ee67c90d62b83654276e2a4d312ac5ee1 Oct 28 00:12:40.948039 kernel: Disabled fast string operations Oct 28 00:12:40.948047 kernel: BIOS-provided physical RAM map: Oct 28 00:12:40.948055 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Oct 28 00:12:40.948060 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Oct 28 00:12:40.948068 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Oct 28 00:12:40.948073 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Oct 28 00:12:40.948078 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Oct 28 00:12:40.948082 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Oct 28 00:12:40.948087 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Oct 28 00:12:40.948092 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Oct 28 00:12:40.948096 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Oct 28 00:12:40.948101 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Oct 28 00:12:40.948108 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Oct 28 00:12:40.948113 kernel: NX (Execute Disable) protection: active Oct 28 00:12:40.948118 kernel: APIC: Static calls initialized Oct 28 00:12:40.948123 kernel: SMBIOS 2.7 present. Oct 28 00:12:40.948129 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Oct 28 00:12:40.948134 kernel: DMI: Memory slots populated: 1/128 Oct 28 00:12:40.948140 kernel: vmware: hypercall mode: 0x00 Oct 28 00:12:40.948145 kernel: Hypervisor detected: VMware Oct 28 00:12:40.948151 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Oct 28 00:12:40.948156 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Oct 28 00:12:40.948161 kernel: vmware: using clock offset of 3019040660 ns Oct 28 00:12:40.948166 kernel: tsc: Detected 3408.000 MHz processor Oct 28 00:12:40.948172 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Oct 28 00:12:40.948178 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Oct 28 00:12:40.948183 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Oct 28 00:12:40.948190 kernel: total RAM covered: 3072M Oct 28 00:12:40.948195 kernel: Found optimal setting for mtrr clean up Oct 28 00:12:40.948202 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Oct 28 00:12:40.948207 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Oct 28 00:12:40.948213 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Oct 28 00:12:40.948218 kernel: Using GB pages for direct mapping Oct 28 00:12:40.948223 kernel: ACPI: Early table checksum verification disabled Oct 28 00:12:40.948229 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Oct 28 00:12:40.948235 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Oct 28 00:12:40.948241 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Oct 28 00:12:40.948246 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Oct 28 00:12:40.948254 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Oct 28 00:12:40.948259 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Oct 28 00:12:40.948266 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Oct 28 00:12:40.948272 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Oct 28 00:12:40.948278 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Oct 28 00:12:40.948283 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Oct 28 00:12:40.948289 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Oct 28 00:12:40.948295 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Oct 28 00:12:40.948302 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Oct 28 00:12:40.948307 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Oct 28 00:12:40.948313 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Oct 28 00:12:40.948318 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Oct 28 00:12:40.948324 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Oct 28 00:12:40.948329 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Oct 28 00:12:40.948335 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Oct 28 00:12:40.948341 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Oct 28 00:12:40.948347 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Oct 28 00:12:40.948353 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Oct 28 00:12:40.948359 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Oct 28 00:12:40.948365 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Oct 28 00:12:40.948370 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Oct 28 00:12:40.948376 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Oct 28 00:12:40.948382 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Oct 28 00:12:40.948389 kernel: Zone ranges: Oct 28 00:12:40.948395 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Oct 28 00:12:40.948400 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Oct 28 00:12:40.948406 kernel: Normal empty Oct 28 00:12:40.948412 kernel: Device empty Oct 28 00:12:40.948417 kernel: Movable zone start for each node Oct 28 00:12:40.948423 kernel: Early memory node ranges Oct 28 00:12:40.948429 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Oct 28 00:12:40.948435 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Oct 28 00:12:40.948441 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Oct 28 00:12:40.948446 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Oct 28 00:12:40.948452 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Oct 28 00:12:40.948458 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Oct 28 00:12:40.948463 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Oct 28 00:12:40.948469 kernel: ACPI: PM-Timer IO Port: 0x1008 Oct 28 00:12:40.948475 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Oct 28 00:12:40.948481 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Oct 28 00:12:40.948487 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Oct 28 00:12:40.948493 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Oct 28 00:12:40.948498 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Oct 28 00:12:40.948504 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Oct 28 00:12:40.948509 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge 
lint[0x1]) Oct 28 00:12:40.948515 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Oct 28 00:12:40.948520 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Oct 28 00:12:40.948527 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Oct 28 00:12:40.948532 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Oct 28 00:12:40.948538 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Oct 28 00:12:40.948543 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Oct 28 00:12:40.948549 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Oct 28 00:12:40.948554 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Oct 28 00:12:40.948560 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Oct 28 00:12:40.948565 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Oct 28 00:12:40.948572 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Oct 28 00:12:40.948577 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Oct 28 00:12:40.948583 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Oct 28 00:12:40.948588 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Oct 28 00:12:40.948594 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Oct 28 00:12:40.948599 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Oct 28 00:12:40.948605 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Oct 28 00:12:40.948610 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Oct 28 00:12:40.948617 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Oct 28 00:12:40.948622 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Oct 28 00:12:40.948628 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Oct 28 00:12:40.948633 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Oct 28 00:12:40.948638 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Oct 28 00:12:40.948644 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Oct 28 00:12:40.948650 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Oct 28 00:12:40.948655 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Oct 28 00:12:40.948661 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Oct 28 00:12:40.948667 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Oct 28 00:12:40.948672 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Oct 28 00:12:40.948678 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Oct 28 00:12:40.948683 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Oct 28 00:12:40.948689 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Oct 28 00:12:40.948695 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Oct 28 00:12:40.948705 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Oct 28 00:12:40.948711 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Oct 28 00:12:40.948716 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Oct 28 00:12:40.948722 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Oct 28 00:12:40.948729 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Oct 28 00:12:40.948735 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Oct 28 00:12:40.948741 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Oct 28 00:12:40.948746 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Oct 28 00:12:40.948753 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Oct 28 00:12:40.948759 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x31] high edge lint[0x1]) Oct 28 00:12:40.948765 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Oct 28 00:12:40.948771 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Oct 28 00:12:40.948777 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Oct 28 00:12:40.948783 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Oct 28 00:12:40.948788 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Oct 28 00:12:40.948794 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Oct 28 00:12:40.948801 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Oct 28 00:12:40.948807 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Oct 28 00:12:40.948813 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Oct 28 00:12:40.948819 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Oct 28 00:12:40.948825 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Oct 28 00:12:40.948831 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Oct 28 00:12:40.948837 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Oct 28 00:12:40.948842 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Oct 28 00:12:40.948849 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Oct 28 00:12:40.948855 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Oct 28 00:12:40.948861 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Oct 28 00:12:40.948867 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Oct 28 00:12:40.948873 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Oct 28 00:12:40.948879 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Oct 28 00:12:40.948885 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Oct 28 00:12:40.948891 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Oct 28 00:12:40.948896 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Oct 28 00:12:40.948903 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Oct 28 00:12:40.948909 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Oct 28 00:12:40.948914 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Oct 28 00:12:40.948921 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Oct 28 00:12:40.948926 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Oct 28 00:12:40.948932 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Oct 28 00:12:40.948938 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Oct 28 00:12:40.948944 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Oct 28 00:12:40.948951 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Oct 28 00:12:40.948956 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Oct 28 00:12:40.948962 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Oct 28 00:12:40.948968 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Oct 28 00:12:40.948974 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Oct 28 00:12:40.948980 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Oct 28 00:12:40.948986 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Oct 28 00:12:40.948991 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Oct 28 00:12:40.948998 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Oct 28 00:12:40.949005 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Oct 28 00:12:40.949010 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Oct 28 00:12:40.949016 kernel: 
ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Oct 28 00:12:40.949055 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Oct 28 00:12:40.949064 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Oct 28 00:12:40.949070 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Oct 28 00:12:40.949075 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Oct 28 00:12:40.949084 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Oct 28 00:12:40.949089 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Oct 28 00:12:40.949096 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Oct 28 00:12:40.949105 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Oct 28 00:12:40.949115 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Oct 28 00:12:40.949125 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Oct 28 00:12:40.949134 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Oct 28 00:12:40.949143 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Oct 28 00:12:40.949151 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Oct 28 00:12:40.949163 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Oct 28 00:12:40.949173 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Oct 28 00:12:40.949183 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Oct 28 00:12:40.949193 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Oct 28 00:12:40.949205 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Oct 28 00:12:40.949215 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Oct 28 00:12:40.949226 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Oct 28 00:12:40.949236 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Oct 28 00:12:40.949249 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Oct 28 00:12:40.949423 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Oct 28 00:12:40.949434 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Oct 28 00:12:40.949440 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Oct 28 00:12:40.949446 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Oct 28 00:12:40.949452 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Oct 28 00:12:40.949458 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Oct 28 00:12:40.949464 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Oct 28 00:12:40.949472 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Oct 28 00:12:40.949478 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Oct 28 00:12:40.949484 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Oct 28 00:12:40.949489 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Oct 28 00:12:40.949495 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Oct 28 00:12:40.949502 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Oct 28 00:12:40.949508 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Oct 28 00:12:40.949514 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Oct 28 00:12:40.949521 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Oct 28 00:12:40.949528 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Oct 28 00:12:40.949533 kernel: TSC deadline timer available Oct 28 00:12:40.949540 kernel: CPU topo: Max. logical packages: 128 Oct 28 00:12:40.949546 kernel: CPU topo: Max. logical dies: 128 Oct 28 00:12:40.949552 kernel: CPU topo: Max. 
dies per package: 1 Oct 28 00:12:40.949558 kernel: CPU topo: Max. threads per core: 1 Oct 28 00:12:40.949564 kernel: CPU topo: Num. cores per package: 1 Oct 28 00:12:40.949570 kernel: CPU topo: Num. threads per package: 1 Oct 28 00:12:40.949576 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Oct 28 00:12:40.949583 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Oct 28 00:12:40.949589 kernel: Booting paravirtualized kernel on VMware hypervisor Oct 28 00:12:40.949596 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Oct 28 00:12:40.949602 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Oct 28 00:12:40.949608 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Oct 28 00:12:40.949614 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Oct 28 00:12:40.949622 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Oct 28 00:12:40.949628 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Oct 28 00:12:40.949634 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Oct 28 00:12:40.949640 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Oct 28 00:12:40.949647 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Oct 28 00:12:40.949652 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Oct 28 00:12:40.949659 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Oct 28 00:12:40.949666 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Oct 28 00:12:40.949672 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Oct 28 00:12:40.949678 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Oct 28 00:12:40.949684 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Oct 28 00:12:40.949690 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Oct 28 00:12:40.949696 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Oct 28 00:12:40.949702 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Oct 28 00:12:40.949709 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Oct 28 00:12:40.949715 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Oct 28 00:12:40.949722 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=bb8cbc137ff563234eef33bdd51a5c9ee67c90d62b83654276e2a4d312ac5ee1 Oct 28 00:12:40.949729 kernel: random: crng init done Oct 28 00:12:40.949735 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Oct 28 00:12:40.949741 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Oct 28 00:12:40.949748 kernel: printk: log_buf_len min size: 262144 bytes Oct 28 00:12:40.949754 kernel: printk: log_buf_len: 1048576 bytes Oct 28 00:12:40.949760 kernel: printk: early log buf free: 245688(93%) Oct 28 00:12:40.949766 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 28 00:12:40.949772 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Oct 28 00:12:40.949779 kernel: Fallback order for Node 0: 0 Oct 28 00:12:40.949785 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Oct 28 00:12:40.949791 kernel: Policy zone: DMA32 Oct 28 00:12:40.949799 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Oct 28 00:12:40.949805 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Oct 28 00:12:40.949811 kernel: ftrace: allocating 40092 entries in 157 pages Oct 28 00:12:40.949817 kernel: ftrace: allocated 157 pages with 5 groups Oct 28 00:12:40.949823 kernel: Dynamic Preempt: voluntary Oct 28 00:12:40.949829 kernel: rcu: Preemptible hierarchical RCU implementation. Oct 28 00:12:40.949836 kernel: rcu: RCU event tracing is enabled. Oct 28 00:12:40.949843 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Oct 28 00:12:40.949849 kernel: Trampoline variant of Tasks RCU enabled. Oct 28 00:12:40.949855 kernel: Rude variant of Tasks RCU enabled. Oct 28 00:12:40.949861 kernel: Tracing variant of Tasks RCU enabled. Oct 28 00:12:40.949867 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Oct 28 00:12:40.949873 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Oct 28 00:12:40.949879 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 28 00:12:40.949885 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 28 00:12:40.949893 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 28 00:12:40.949899 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Oct 28 00:12:40.949905 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Oct 28 00:12:40.949911 kernel: Console: colour VGA+ 80x25 Oct 28 00:12:40.949917 kernel: printk: legacy console [tty0] enabled Oct 28 00:12:40.949923 kernel: printk: legacy console [ttyS0] enabled Oct 28 00:12:40.949929 kernel: ACPI: Core revision 20240827 Oct 28 00:12:40.949937 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Oct 28 00:12:40.949943 kernel: APIC: Switch to symmetric I/O mode setup Oct 28 00:12:40.949949 kernel: x2apic enabled Oct 28 00:12:40.949955 kernel: APIC: Switched APIC routing to: physical x2apic Oct 28 00:12:40.949962 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Oct 28 00:12:40.949968 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 28 00:12:40.949974 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Oct 28 00:12:40.949982 kernel: Disabled fast string operations Oct 28 00:12:40.949988 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Oct 28 00:12:40.949994 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Oct 28 00:12:40.950000 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Oct 28 00:12:40.950006 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Oct 28 00:12:40.950012 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Oct 28 00:12:40.950019 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Oct 28 00:12:40.950035 kernel: RETBleed: Mitigation: Enhanced IBRS Oct 28 00:12:40.950044 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Oct 28 00:12:40.950051 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Oct 28 00:12:40.950057 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Oct 28 00:12:40.950063 kernel: SRBDS: Unknown: Dependent on hypervisor status Oct 28 00:12:40.950069 kernel: GDS: Unknown: Dependent on hypervisor status Oct 28 00:12:40.950075 kernel: active return thunk: its_return_thunk Oct 28 00:12:40.950081 kernel: ITS: Mitigation: Aligned branch/return thunks Oct 28 00:12:40.950088 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Oct 28 00:12:40.950095 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Oct 28 00:12:40.950101 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Oct 28 00:12:40.950107 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Oct 28 00:12:40.950113 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Oct 28 00:12:40.950119 kernel: Freeing SMP alternatives memory: 32K Oct 28 00:12:40.950126 kernel: pid_max: default: 131072 minimum: 1024 Oct 28 00:12:40.950133 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Oct 28 00:12:40.950139 kernel: landlock: Up and running. Oct 28 00:12:40.950145 kernel: SELinux: Initializing. Oct 28 00:12:40.950151 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Oct 28 00:12:40.950157 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Oct 28 00:12:40.950164 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Oct 28 00:12:40.950170 kernel: Performance Events: Skylake events, core PMU driver. Oct 28 00:12:40.950178 kernel: core: CPUID marked event: 'cpu cycles' unavailable Oct 28 00:12:40.950184 kernel: core: CPUID marked event: 'instructions' unavailable Oct 28 00:12:40.950190 kernel: core: CPUID marked event: 'bus cycles' unavailable Oct 28 00:12:40.950196 kernel: core: CPUID marked event: 'cache references' unavailable Oct 28 00:12:40.950202 kernel: core: CPUID marked event: 'cache misses' unavailable Oct 28 00:12:40.950208 kernel: core: CPUID marked event: 'branch instructions' unavailable Oct 28 00:12:40.950214 kernel: core: CPUID marked event: 'branch misses' unavailable Oct 28 00:12:40.950221 kernel: ... version: 1 Oct 28 00:12:40.950228 kernel: ... bit width: 48 Oct 28 00:12:40.950234 kernel: ... generic registers: 4 Oct 28 00:12:40.950240 kernel: ... value mask: 0000ffffffffffff Oct 28 00:12:40.950246 kernel: ... max period: 000000007fffffff Oct 28 00:12:40.950252 kernel: ... 
fixed-purpose events: 0 Oct 28 00:12:40.950258 kernel: ... event mask: 000000000000000f Oct 28 00:12:40.950266 kernel: signal: max sigframe size: 1776 Oct 28 00:12:40.950272 kernel: rcu: Hierarchical SRCU implementation. Oct 28 00:12:40.950279 kernel: rcu: Max phase no-delay instances is 400. Oct 28 00:12:40.950285 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Oct 28 00:12:40.950291 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Oct 28 00:12:40.950297 kernel: smp: Bringing up secondary CPUs ... Oct 28 00:12:40.950303 kernel: smpboot: x86: Booting SMP configuration: Oct 28 00:12:40.950310 kernel: .... node #0, CPUs: #1 Oct 28 00:12:40.950316 kernel: Disabled fast string operations Oct 28 00:12:40.950323 kernel: smp: Brought up 1 node, 2 CPUs Oct 28 00:12:40.950329 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Oct 28 00:12:40.950335 kernel: Memory: 1946740K/2096628K available (14336K kernel code, 2443K rwdata, 26064K rodata, 15960K init, 2084K bss, 138504K reserved, 0K cma-reserved) Oct 28 00:12:40.950342 kernel: devtmpfs: initialized Oct 28 00:12:40.950348 kernel: x86/mm: Memory block size: 128MB Oct 28 00:12:40.950355 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Oct 28 00:12:40.950362 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Oct 28 00:12:40.950368 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Oct 28 00:12:40.950374 kernel: pinctrl core: initialized pinctrl subsystem Oct 28 00:12:40.950380 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Oct 28 00:12:40.950386 kernel: audit: initializing netlink subsys (disabled) Oct 28 00:12:40.950392 kernel: audit: type=2000 audit(1761610357.277:1): state=initialized audit_enabled=0 res=1 Oct 28 00:12:40.950400 kernel: thermal_sys: Registered thermal governor 'step_wise' Oct 28 00:12:40.950406 kernel: thermal_sys: Registered thermal governor 'user_space' Oct 28 00:12:40.950412 kernel: cpuidle: using governor menu Oct 28 00:12:40.950418 kernel: Simple Boot Flag at 0x36 set to 0x80 Oct 28 00:12:40.950424 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Oct 28 00:12:40.950430 kernel: dca service started, version 1.12.1 Oct 28 00:12:40.950437 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Oct 28 00:12:40.950450 kernel: PCI: Using configuration type 1 for base access Oct 28 00:12:40.950458 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Oct 28 00:12:40.950464 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Oct 28 00:12:40.950471 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Oct 28 00:12:40.950477 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Oct 28 00:12:40.950483 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Oct 28 00:12:40.950490 kernel: ACPI: Added _OSI(Module Device) Oct 28 00:12:40.950496 kernel: ACPI: Added _OSI(Processor Device) Oct 28 00:12:40.950504 kernel: ACPI: Added _OSI(Processor Aggregator Device) Oct 28 00:12:40.950511 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Oct 28 00:12:40.950517 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Oct 28 00:12:40.950523 kernel: ACPI: Interpreter enabled Oct 28 00:12:40.950530 kernel: ACPI: PM: (supports S0 S1 S5) Oct 28 00:12:40.950536 kernel: ACPI: Using IOAPIC for interrupt routing Oct 28 00:12:40.950542 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Oct 28 00:12:40.950550 kernel: PCI: Using E820 reservations for host bridge windows Oct 28 00:12:40.950556 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Oct 28 00:12:40.950563 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Oct 28 00:12:40.950662 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Oct 28 00:12:40.950733 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Oct 28 00:12:40.950799 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Oct 28 00:12:40.950811 kernel: PCI host bridge to bus 0000:00 Oct 28 00:12:40.950878 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Oct 28 00:12:40.950939 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Oct 28 00:12:40.950997 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Oct 28 00:12:40.951349 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Oct 28 00:12:40.951415 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Oct 28 00:12:40.951723 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Oct 28 00:12:40.951803 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Oct 28 00:12:40.951877 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Oct 28 00:12:40.951945 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 28 00:12:40.952018 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Oct 28 00:12:40.952106 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Oct 28 00:12:40.952177 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Oct 28 00:12:40.952243 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Oct 28 00:12:40.952312 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Oct 28 00:12:40.952378 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Oct 28 00:12:40.952443 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Oct 28 00:12:40.952512 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Oct 28 00:12:40.952577 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Oct 28 00:12:40.952645 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Oct 28 00:12:40.952715 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 
0x088000 conventional PCI endpoint Oct 28 00:12:40.952782 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Oct 28 00:12:40.952846 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Oct 28 00:12:40.952915 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Oct 28 00:12:40.952980 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Oct 28 00:12:40.953055 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Oct 28 00:12:40.953121 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Oct 28 00:12:40.953185 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Oct 28 00:12:40.953257 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Oct 28 00:12:40.953329 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Oct 28 00:12:40.953398 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Oct 28 00:12:40.953467 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Oct 28 00:12:40.953533 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Oct 28 00:12:40.953599 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 28 00:12:40.953669 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.953736 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 28 00:12:40.953804 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Oct 28 00:12:40.953869 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Oct 28 00:12:40.953935 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.954003 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.956102 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 28 00:12:40.956178 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Oct 28 00:12:40.956251 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Oct 28 00:12:40.956320 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Oct 28 00:12:40.956387 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.956459 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.956527 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 28 00:12:40.956597 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Oct 28 00:12:40.956663 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Oct 28 00:12:40.956729 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:12:40.956795 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.956866 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.956932 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:12:40.957012 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 28 00:12:40.957113 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:12:40.957180 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.957254 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.957321 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:12:40.957390 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 28 00:12:40.957459 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:12:40.957524 
kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.957595 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.957661 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:12:40.957726 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 28 00:12:40.957791 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:12:40.957860 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.957931 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.957997 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:12:40.958071 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 28 00:12:40.958137 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 00:12:40.958207 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.958280 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.958346 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:12:40.958412 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 28 00:12:40.958477 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:12:40.958544 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.958615 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.958683 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:12:40.958748 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 28 00:12:40.958815 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 28 00:12:40.958879 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.958949 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.959015 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:12:40.959937 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 28 00:12:40.960015 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 28 00:12:40.960096 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 28 00:12:40.960165 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.960237 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.960318 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:12:40.960390 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 28 00:12:40.960456 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 28 00:12:40.960523 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:12:40.960590 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.960661 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.960731 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:12:40.960797 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 28 00:12:40.960862 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:12:40.960927 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.960997 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.961072 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:12:40.961146 kernel: pci 
0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 28 00:12:40.961214 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:12:40.961279 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.961350 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.961417 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:12:40.961483 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 28 00:12:40.961552 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:12:40.961618 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.961687 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.961756 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:12:40.961822 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 28 00:12:40.961888 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:12:40.961956 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.962038 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.962107 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 00:12:40.962173 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 28 00:12:40.962238 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:12:40.962304 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.962377 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.962443 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 28 00:12:40.962507 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 28 00:12:40.962573 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 28 00:12:40.962637 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:12:40.962702 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.962774 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.962840 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:12:40.962906 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 28 00:12:40.962975 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 28 00:12:40.963059 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:12:40.963128 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.963203 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.963270 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:12:40.963337 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 28 00:12:40.963406 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 28 00:12:40.963471 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 00:12:40.963537 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.963608 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.963674 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:12:40.963742 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 28 00:12:40.963809 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 
00:12:40.963874 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.963945 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.964011 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:12:40.964092 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 28 00:12:40.964163 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:12:40.964231 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.964303 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.964379 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:12:40.964449 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 28 00:12:40.964522 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:12:40.964602 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.964859 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.964932 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:12:40.965000 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 28 00:12:40.965078 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Oct 28 00:12:40.965148 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.965226 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.965294 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:12:40.965359 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 28 00:12:40.965425 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:12:40.965491 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.965561 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.965631 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:12:40.965696 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 28 00:12:40.965761 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 28 00:12:40.965826 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:12:40.965890 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.965960 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.966073 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:12:40.967076 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 28 00:12:40.967221 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Oct 28 00:12:40.967300 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:12:40.967661 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.967737 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.967811 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:12:40.967880 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 28 00:12:40.967947 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:12:40.968013 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.968096 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.968164 kernel: pci 0000:00:18.3: PCI bridge to [bus 
1e] Oct 28 00:12:40.968233 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 28 00:12:40.968298 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 28 00:12:40.968364 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.968438 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.968504 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:12:40.968570 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 28 00:12:40.968639 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 28 00:12:40.968704 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.968774 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.968840 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:12:40.968905 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 28 00:12:40.968970 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:12:40.969058 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.969131 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.969197 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:12:40.969263 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 28 00:12:40.969328 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:12:40.969394 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.969467 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 28 00:12:40.969534 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:12:40.969600 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 28 00:12:40.969666 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:12:40.969731 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.969801 kernel: pci_bus 0000:01: extended config space not accessible Oct 28 00:12:40.969872 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 28 00:12:40.969939 kernel: pci_bus 0000:02: extended config space not accessible Oct 28 00:12:40.969949 kernel: acpiphp: Slot [32] registered Oct 28 00:12:40.969956 kernel: acpiphp: Slot [33] registered Oct 28 00:12:40.969963 kernel: acpiphp: Slot [34] registered Oct 28 00:12:40.969969 kernel: acpiphp: Slot [35] registered Oct 28 00:12:40.969978 kernel: acpiphp: Slot [36] registered Oct 28 00:12:40.969985 kernel: acpiphp: Slot [37] registered Oct 28 00:12:40.969991 kernel: acpiphp: Slot [38] registered Oct 28 00:12:40.969997 kernel: acpiphp: Slot [39] registered Oct 28 00:12:40.970004 kernel: acpiphp: Slot [40] registered Oct 28 00:12:40.970011 kernel: acpiphp: Slot [41] registered Oct 28 00:12:40.970017 kernel: acpiphp: Slot [42] registered Oct 28 00:12:40.970033 kernel: acpiphp: Slot [43] registered Oct 28 00:12:40.970044 kernel: acpiphp: Slot [44] registered Oct 28 00:12:40.970050 kernel: acpiphp: Slot [45] registered Oct 28 00:12:40.970057 kernel: acpiphp: Slot [46] registered Oct 28 00:12:40.970063 kernel: acpiphp: Slot [47] registered Oct 28 00:12:40.970070 kernel: acpiphp: Slot [48] registered Oct 28 00:12:40.970076 kernel: acpiphp: Slot [49] registered Oct 28 00:12:40.970083 kernel: acpiphp: Slot [50] registered Oct 28 00:12:40.970090 kernel: acpiphp: Slot [51] registered Oct 28 
00:12:40.970097 kernel: acpiphp: Slot [52] registered Oct 28 00:12:40.970103 kernel: acpiphp: Slot [53] registered Oct 28 00:12:40.970109 kernel: acpiphp: Slot [54] registered Oct 28 00:12:40.970116 kernel: acpiphp: Slot [55] registered Oct 28 00:12:40.970122 kernel: acpiphp: Slot [56] registered Oct 28 00:12:40.970129 kernel: acpiphp: Slot [57] registered Oct 28 00:12:40.970137 kernel: acpiphp: Slot [58] registered Oct 28 00:12:40.970147 kernel: acpiphp: Slot [59] registered Oct 28 00:12:40.970153 kernel: acpiphp: Slot [60] registered Oct 28 00:12:40.970160 kernel: acpiphp: Slot [61] registered Oct 28 00:12:40.970166 kernel: acpiphp: Slot [62] registered Oct 28 00:12:40.970173 kernel: acpiphp: Slot [63] registered Oct 28 00:12:40.970245 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Oct 28 00:12:40.970313 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Oct 28 00:12:40.970381 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Oct 28 00:12:40.970447 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Oct 28 00:12:40.970512 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Oct 28 00:12:40.970577 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Oct 28 00:12:40.970651 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Oct 28 00:12:40.970723 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Oct 28 00:12:40.970790 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Oct 28 00:12:40.970858 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 28 00:12:40.970924 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Oct 28 00:12:40.970992 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Oct 28 00:12:40.971070 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 28 00:12:40.971147 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 28 00:12:40.971217 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 28 00:12:40.971284 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:12:40.971353 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:12:40.971420 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:12:40.971487 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:12:40.971559 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:12:40.971632 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Oct 28 00:12:40.971700 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Oct 28 00:12:40.971766 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Oct 28 00:12:40.971831 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Oct 28 00:12:40.971898 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Oct 28 00:12:40.971967 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 28 00:12:40.974630 kernel: pci 0000:0b:00.0: supports D1 D2 Oct 28 00:12:40.974716 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 28 00:12:40.974789 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Oct 28 00:12:40.974860 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:12:40.974931 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:12:40.975001 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:12:40.975091 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:12:40.975161 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:12:40.975230 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:12:40.975298 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:12:40.975366 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 00:12:40.975438 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 28 00:12:40.975506 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:12:40.975574 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:12:40.975643 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:12:40.975712 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:12:40.975780 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:12:40.975849 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:12:40.975919 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:12:40.975988 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:12:40.977096 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:12:40.977183 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:12:40.977256 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 28 00:12:40.977327 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:12:40.977404 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:12:40.977473 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:12:40.977541 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:12:40.977551 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Oct 28 00:12:40.977558 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Oct 28 00:12:40.977565 kernel: ACPI: PCI: Interrupt link LNKB disabled Oct 28 00:12:40.977574 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Oct 28 00:12:40.977580 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Oct 28 00:12:40.977587 kernel: iommu: Default domain type: Translated Oct 28 00:12:40.977594 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Oct 28 00:12:40.977601 kernel: PCI: Using ACPI for IRQ routing Oct 28 00:12:40.977607 kernel: PCI: pci_cache_line_size set to 64 bytes Oct 28 00:12:40.977614 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Oct 28 00:12:40.977622 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Oct 28 00:12:40.977689 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Oct 28 00:12:40.977754 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Oct 28 00:12:40.977819 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Oct 28 00:12:40.977828 kernel: vgaarb: loaded Oct 28 00:12:40.977835 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Oct 28 00:12:40.977842 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Oct 28 00:12:40.977850 kernel: clocksource: Switched to clocksource tsc-early Oct 28 00:12:40.977857 kernel: VFS: Disk quotas dquot_6.6.0 Oct 28 00:12:40.977864 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Oct 28 00:12:40.977871 kernel: pnp: PnP ACPI init Oct 28 00:12:40.977941 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Oct 28 00:12:40.978004 kernel: system 
00:00: [io 0x1040-0x104f] has been reserved Oct 28 00:12:40.978082 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Oct 28 00:12:40.978150 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Oct 28 00:12:40.978215 kernel: pnp 00:06: [dma 2] Oct 28 00:12:40.978281 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Oct 28 00:12:40.978343 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Oct 28 00:12:40.978404 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Oct 28 00:12:40.978415 kernel: pnp: PnP ACPI: found 8 devices Oct 28 00:12:40.978422 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Oct 28 00:12:40.978429 kernel: NET: Registered PF_INET protocol family Oct 28 00:12:40.978436 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Oct 28 00:12:40.978443 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Oct 28 00:12:40.978450 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Oct 28 00:12:40.978456 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Oct 28 00:12:40.978464 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 28 00:12:40.978471 kernel: TCP: Hash tables configured (established 16384 bind 16384) Oct 28 00:12:40.978478 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 28 00:12:40.978484 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 28 00:12:40.978491 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Oct 28 00:12:40.978497 kernel: NET: Registered PF_XDP protocol family Oct 28 00:12:40.978563 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Oct 28 00:12:40.978635 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Oct 28 00:12:40.978702 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Oct 28 00:12:40.978769 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Oct 28 00:12:40.978836 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Oct 28 00:12:40.978903 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Oct 28 00:12:40.978970 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Oct 28 00:12:40.979057 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Oct 28 00:12:40.979128 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Oct 28 00:12:40.979200 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Oct 28 00:12:40.979266 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Oct 28 00:12:40.979332 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Oct 28 00:12:40.979398 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Oct 28 00:12:40.979467 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Oct 28 00:12:40.979534 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Oct 28 00:12:40.979602 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Oct 28 
00:12:40.979668 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Oct 28 00:12:40.979747 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Oct 28 00:12:40.979814 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Oct 28 00:12:40.979882 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Oct 28 00:12:40.979948 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Oct 28 00:12:40.980014 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Oct 28 00:12:40.980094 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Oct 28 00:12:40.980161 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Oct 28 00:12:40.980227 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Oct 28 00:12:40.980293 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.980361 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.980427 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.980492 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.980559 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.980631 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.980705 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.980781 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.980847 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.980913 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.980979 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981058 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981124 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981204 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981270 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981335 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981400 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981466 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981532 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981597 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981665 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981746 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981813 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.981878 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.981944 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982010 kernel: pci 0000:00:17.5: 
bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982092 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982159 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982224 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982290 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982355 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982420 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982487 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982552 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982618 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982684 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982753 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982831 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.982897 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.982966 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983038 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983105 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983171 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983236 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983302 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983368 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983437 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983502 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983568 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983633 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983699 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983773 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.983854 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.983929 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.984409 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.984490 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.984567 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.984636 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.984704 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.984772 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.984839 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space Oct 28 00:12:40.984910 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.984977 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985058 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985127 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985193 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985259 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985325 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985395 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985462 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985528 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985593 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985659 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985724 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985793 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.985859 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.985938 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.986005 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.986087 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.986169 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.986241 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.986305 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.986371 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 28 00:12:40.986437 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 28 00:12:40.986504 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 28 00:12:40.986570 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Oct 28 00:12:40.986635 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Oct 28 00:12:40.986702 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Oct 28 00:12:40.986767 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 28 00:12:40.986837 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Oct 28 00:12:40.986912 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 28 00:12:40.986982 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Oct 28 00:12:40.987075 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Oct 28 00:12:40.987141 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Oct 28 00:12:40.987213 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 28 00:12:40.987279 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Oct 28 00:12:40.987344 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Oct 28 00:12:40.987409 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] Oct 28 00:12:40.987475 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 28 00:12:40.987541 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Oct 28 00:12:40.987606 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Oct 28 00:12:40.987675 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:12:40.987740 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 28 00:12:40.987819 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 28 00:12:40.987885 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:12:40.987956 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 28 00:12:40.988063 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 28 00:12:40.988132 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:12:40.988201 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 28 00:12:40.988280 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 28 00:12:40.988346 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:12:40.988413 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 28 00:12:40.988478 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 28 00:12:40.988544 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 00:12:40.988611 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 28 00:12:40.988676 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 28 00:12:40.988741 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:12:40.988810 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Oct 28 00:12:40.988877 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 28 00:12:40.988942 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 28 00:12:40.989010 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 28 00:12:40.989090 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Oct 28 00:12:40.989174 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 28 00:12:40.989991 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 28 00:12:40.990089 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 28 00:12:40.990163 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 28 00:12:40.990238 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 28 00:12:40.990311 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 28 00:12:40.990379 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 28 00:12:40.990446 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:12:40.990513 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 28 00:12:40.990580 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 28 00:12:40.990646 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:12:40.990713 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 28 00:12:40.990781 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 28 00:12:40.990848 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:12:40.990918 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 28 00:12:40.990985 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 28 00:12:40.991064 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:12:40.991132 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 28 00:12:40.991214 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 28 00:12:40.991282 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:12:40.991350 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 28 00:12:40.991418 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 28 00:12:40.991485 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:12:40.991554 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 28 00:12:40.991623 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 28 00:12:40.991690 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 28 00:12:40.991755 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:12:40.991823 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 28 00:12:40.991889 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 28 00:12:40.991955 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 28 00:12:40.992106 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:12:40.992183 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 28 00:12:40.992253 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 28 00:12:40.992320 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 28 00:12:40.992387 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 00:12:40.992454 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 28 00:12:40.992519 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 28 00:12:40.992586 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 00:12:40.992655 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 28 00:12:40.992721 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 28 00:12:40.992787 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:12:40.992855 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 28 00:12:40.993035 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 28 00:12:40.993106 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:12:40.993178 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 28 00:12:40.993244 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 28 00:12:40.993310 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Oct 28 00:12:40.993379 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 28 00:12:40.993446 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 28 00:12:40.993512 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:12:40.993584 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 28 00:12:40.993651 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 28 00:12:40.993719 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 28 00:12:40.993785 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:12:40.993852 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 28 00:12:40.993918 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 28 00:12:40.993984 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
Oct 28 00:12:40.994070 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:12:40.994142 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 28 00:12:40.994209 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 28 00:12:40.994275 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:12:40.994344 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 28 00:12:40.994410 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 28 00:12:40.994475 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 28 00:12:40.994546 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 28 00:12:40.994613 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 28 00:12:40.994680 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 28 00:12:40.994749 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 28 00:12:40.994816 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 28 00:12:40.994882 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:12:40.994956 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 28 00:12:40.995032 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 28 00:12:40.995105 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:12:40.995174 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 28 00:12:40.995251 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 28 00:12:40.995348 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:12:40.995416 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Oct 28 00:12:40.995476 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 28 00:12:40.995539 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 28 00:12:40.995605 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Oct 28 00:12:40.995689 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Oct 28 00:12:40.995755 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Oct 28 00:12:40.995819 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Oct 28 00:12:40.995883 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 28 00:12:40.995945 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Oct 28 00:12:40.996005 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 28 00:12:40.996082 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 28 00:12:40.996380 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Oct 28 00:12:40.996446 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Oct 28 00:12:40.996516 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Oct 28 00:12:40.996578 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Oct 28 00:12:40.996638 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Oct 28 00:12:40.996704 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Oct 28 00:12:40.996766 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Oct 28 00:12:40.996830 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Oct 28 00:12:40.996896 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Oct 28 00:12:40.996958 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Oct 
28 00:12:40.997019 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Oct 28 00:12:40.997234 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Oct 28 00:12:40.997301 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Oct 28 00:12:40.997368 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Oct 28 00:12:40.997430 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 28 00:12:40.997498 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Oct 28 00:12:40.997560 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Oct 28 00:12:40.997625 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Oct 28 00:12:40.997690 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Oct 28 00:12:40.997755 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Oct 28 00:12:40.997817 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Oct 28 00:12:40.997882 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Oct 28 00:12:40.997946 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Oct 28 00:12:40.998007 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Oct 28 00:12:40.998083 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Oct 28 00:12:40.998146 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Oct 28 00:12:40.998207 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Oct 28 00:12:40.998274 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Oct 28 00:12:40.998339 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Oct 28 00:12:40.998400 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Oct 28 00:12:40.998580 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Oct 28 00:12:40.998647 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 28 00:12:40.998713 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Oct 28 00:12:40.998778 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 28 00:12:40.998847 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Oct 28 00:12:40.998908 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Oct 28 00:12:40.998974 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Oct 28 00:12:40.999046 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Oct 28 00:12:40.999114 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Oct 28 00:12:40.999179 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 28 00:12:40.999244 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Oct 28 00:12:40.999305 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Oct 28 00:12:40.999366 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 28 00:12:40.999430 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Oct 28 00:12:40.999494 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Oct 28 00:12:40.999555 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Oct 28 00:12:40.999623 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Oct 28 00:12:40.999684 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Oct 28 00:12:40.999744 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Oct 28 00:12:40.999809 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Oct 28 00:12:40.999873 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 28 00:12:40.999938 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Oct 28 00:12:41.000000 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 28 00:12:41.000076 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Oct 28 00:12:41.000137 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Oct 28 00:12:41.000207 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Oct 28 00:12:41.000269 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Oct 28 00:12:41.000334 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Oct 28 00:12:41.000395 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 28 00:12:41.000459 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Oct 28 00:12:41.000523 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Oct 28 00:12:41.000584 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Oct 28 00:12:41.000648 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Oct 28 00:12:41.000709 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Oct 28 00:12:41.000770 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Oct 28 00:12:41.000837 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Oct 28 00:12:41.000901 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Oct 28 00:12:41.000966 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Oct 28 00:12:41.001039 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 28 00:12:41.001109 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Oct 28 00:12:41.001183 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Oct 28 00:12:41.001280 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Oct 28 00:12:41.001342 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Oct 28 00:12:41.001408 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Oct 28 00:12:41.003717 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Oct 28 00:12:41.003800 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Oct 28 00:12:41.003872 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 28 00:12:41.003951 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Oct 28 00:12:41.003968 kernel: PCI: CLS 32 bytes, default 64 Oct 28 00:12:41.003980 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Oct 28 00:12:41.003993 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 28 00:12:41.004003 kernel: clocksource: Switched to clocksource tsc Oct 28 00:12:41.004010 kernel: Initialise system trusted keyrings Oct 28 00:12:41.004019 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Oct 28 00:12:41.004036 kernel: Key type asymmetric registered Oct 28 00:12:41.004043 kernel: Asymmetric key parser 'x509' registered Oct 28 00:12:41.004054 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Oct 28 00:12:41.004063 kernel: io scheduler mq-deadline registered Oct 28 00:12:41.004070 kernel: io scheduler kyber registered Oct 28 00:12:41.004077 kernel: io scheduler bfq 
registered Oct 28 00:12:41.004155 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Oct 28 00:12:41.004228 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.004315 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Oct 28 00:12:41.004406 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.004484 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Oct 28 00:12:41.004556 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.004630 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Oct 28 00:12:41.004709 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.004808 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Oct 28 00:12:41.004878 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.004946 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Oct 28 00:12:41.005019 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005116 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Oct 28 00:12:41.005194 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005262 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Oct 28 00:12:41.005330 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005398 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Oct 28 00:12:41.005464 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005535 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Oct 28 00:12:41.005604 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005676 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Oct 28 00:12:41.005744 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005813 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Oct 28 00:12:41.005892 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.005977 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Oct 28 00:12:41.010096 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.010191 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Oct 28 00:12:41.010264 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
Oct 28 00:12:41.010335 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Oct 28 00:12:41.010404 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.010480 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Oct 28 00:12:41.010550 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.010634 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Oct 28 00:12:41.010704 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.010773 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Oct 28 00:12:41.010848 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.010921 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Oct 28 00:12:41.010989 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011076 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Oct 28 00:12:41.011155 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011233 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Oct 28 00:12:41.011311 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011387 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Oct 28 00:12:41.011455 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011523 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Oct 28 00:12:41.011591 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011669 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Oct 28 00:12:41.011739 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011816 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Oct 28 00:12:41.011891 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.011960 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Oct 28 00:12:41.012421 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.012516 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Oct 28 00:12:41.012589 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.012663 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Oct 28 00:12:41.012733 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 
00:12:41.012803 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Oct 28 00:12:41.012872 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.012942 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Oct 28 00:12:41.013011 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.013115 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Oct 28 00:12:41.013186 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.013256 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Oct 28 00:12:41.013325 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 28 00:12:41.013338 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Oct 28 00:12:41.013345 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Oct 28 00:12:41.013354 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Oct 28 00:12:41.013361 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Oct 28 00:12:41.013369 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Oct 28 00:12:41.013376 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Oct 28 00:12:41.013446 kernel: rtc_cmos 00:01: registered as rtc0 Oct 28 00:12:41.013610 kernel: rtc_cmos 00:01: setting system clock to 2025-10-28T00:12:39 UTC (1761610359) Oct 28 00:12:41.013624 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Oct 28 00:12:41.013690 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Oct 28 00:12:41.013705 kernel: intel_pstate: CPU model not supported Oct 28 00:12:41.013712 kernel: NET: Registered PF_INET6 protocol family Oct 28 00:12:41.013719 kernel: Segment Routing with IPv6 Oct 28 00:12:41.013727 kernel: In-situ OAM (IOAM) with IPv6 Oct 28 00:12:41.013734 kernel: NET: Registered PF_PACKET protocol family Oct 28 00:12:41.013743 kernel: Key type dns_resolver registered Oct 28 00:12:41.013751 kernel: IPI shorthand broadcast: enabled Oct 28 00:12:41.013758 kernel: sched_clock: Marking stable (1458003268, 170241394)->(1642129075, -13884413) Oct 28 00:12:41.013765 kernel: registered taskstats version 1 Oct 28 00:12:41.013772 kernel: Loading compiled-in X.509 certificates Oct 28 00:12:41.013779 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.54-flatcar: 83e3b158efa5b2676019c86f243fd682d3067554' Oct 28 00:12:41.013786 kernel: Demotion targets for Node 0: null Oct 28 00:12:41.013794 kernel: Key type .fscrypt registered Oct 28 00:12:41.013801 kernel: Key type fscrypt-provisioning registered Oct 28 00:12:41.013808 kernel: ima: No TPM chip found, activating TPM-bypass! 
Oct 28 00:12:41.013815 kernel: ima: Allocated hash algorithm: sha1 Oct 28 00:12:41.013822 kernel: ima: No architecture policies found Oct 28 00:12:41.013829 kernel: clk: Disabling unused clocks Oct 28 00:12:41.013836 kernel: Freeing unused kernel image (initmem) memory: 15960K Oct 28 00:12:41.013844 kernel: Write protecting the kernel read-only data: 40960k Oct 28 00:12:41.013851 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Oct 28 00:12:41.013859 kernel: Run /init as init process Oct 28 00:12:41.013865 kernel: with arguments: Oct 28 00:12:41.013872 kernel: /init Oct 28 00:12:41.013880 kernel: with environment: Oct 28 00:12:41.013886 kernel: HOME=/ Oct 28 00:12:41.013893 kernel: TERM=linux Oct 28 00:12:41.013903 kernel: SCSI subsystem initialized Oct 28 00:12:41.013910 kernel: VMware PVSCSI driver - version 1.0.7.0-k Oct 28 00:12:41.013917 kernel: vmw_pvscsi: using 64bit dma Oct 28 00:12:41.013924 kernel: vmw_pvscsi: max_id: 16 Oct 28 00:12:41.013931 kernel: vmw_pvscsi: setting ring_pages to 8 Oct 28 00:12:41.013938 kernel: vmw_pvscsi: enabling reqCallThreshold Oct 28 00:12:41.013946 kernel: vmw_pvscsi: driver-based request coalescing enabled Oct 28 00:12:41.013954 kernel: vmw_pvscsi: using MSI-X Oct 28 00:12:41.014050 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Oct 28 00:12:41.014128 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Oct 28 00:12:41.014209 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Oct 28 00:12:41.014289 kernel: sd 0:0:0:0: [sda] 25804800 512-byte logical blocks: (13.2 GB/12.3 GiB) Oct 28 00:12:41.014372 kernel: sd 0:0:0:0: [sda] Write Protect is off Oct 28 00:12:41.014447 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Oct 28 00:12:41.014519 kernel: sd 0:0:0:0: [sda] Cache data unavailable Oct 28 00:12:41.014591 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Oct 28 00:12:41.014601 kernel: libata version 3.00 loaded. Oct 28 00:12:41.014609 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 28 00:12:41.014680 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Oct 28 00:12:41.014758 kernel: ata_piix 0000:00:07.1: version 2.13 Oct 28 00:12:41.014836 kernel: scsi host1: ata_piix Oct 28 00:12:41.014918 kernel: scsi host2: ata_piix Oct 28 00:12:41.014929 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Oct 28 00:12:41.014937 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Oct 28 00:12:41.014944 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Oct 28 00:12:41.015020 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Oct 28 00:12:41.015107 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Oct 28 00:12:41.015117 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Oct 28 00:12:41.015126 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 28 00:12:41.015133 kernel: device-mapper: uevent: version 1.0.3 Oct 28 00:12:41.015140 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Oct 28 00:12:41.015211 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Oct 28 00:12:41.015223 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Oct 28 00:12:41.015231 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015238 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015245 kernel: raid6: avx2x4 gen() 42300 MB/s Oct 28 00:12:41.015252 kernel: raid6: avx2x2 gen() 50012 MB/s Oct 28 00:12:41.015259 kernel: raid6: avx2x1 gen() 44334 MB/s Oct 28 00:12:41.015265 kernel: raid6: using algorithm avx2x2 gen() 50012 MB/s Oct 28 00:12:41.015274 kernel: raid6: .... xor() 32091 MB/s, rmw enabled Oct 28 00:12:41.015281 kernel: raid6: using avx2x2 recovery algorithm Oct 28 00:12:41.015288 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015295 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015302 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015308 kernel: xor: automatically using best checksumming function avx Oct 28 00:12:41.015315 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015322 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 28 00:12:41.015331 kernel: BTRFS: device fsid 4fda63c0-e2d9-4674-a954-1a6d4907fb92 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (192) Oct 28 00:12:41.015338 kernel: BTRFS info (device dm-0): first mount of filesystem 4fda63c0-e2d9-4674-a954-1a6d4907fb92 Oct 28 00:12:41.015345 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:12:41.015352 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 28 00:12:41.015359 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 28 00:12:41.015366 kernel: BTRFS info (device dm-0): enabling free space tree Oct 28 00:12:41.015373 kernel: Invalid ELF header magic: != \u007fELF Oct 28 00:12:41.015381 kernel: loop: module loaded Oct 28 00:12:41.015388 kernel: loop0: detected capacity change from 0 to 100120 Oct 28 00:12:41.015395 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 28 00:12:41.015403 systemd[1]: Successfully made /usr/ read-only. Oct 28 00:12:41.015412 systemd[1]: systemd 257.7 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 28 00:12:41.015420 systemd[1]: Detected virtualization vmware. Oct 28 00:12:41.015431 systemd[1]: Detected architecture x86-64. Oct 28 00:12:41.015438 systemd[1]: Running in initrd. Oct 28 00:12:41.015445 systemd[1]: No hostname configured, using default hostname. Oct 28 00:12:41.015452 systemd[1]: Hostname set to . Oct 28 00:12:41.015461 systemd[1]: Initializing machine ID from random generator. Oct 28 00:12:41.015471 systemd[1]: Queued start job for default target initrd.target. Oct 28 00:12:41.015480 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Oct 28 00:12:41.015487 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Oct 28 00:12:41.015494 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 28 00:12:41.015502 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Oct 28 00:12:41.015509 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 28 00:12:41.015516 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 28 00:12:41.015525 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 28 00:12:41.015534 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 28 00:12:41.015541 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 28 00:12:41.015548 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Oct 28 00:12:41.015556 systemd[1]: Reached target paths.target - Path Units. Oct 28 00:12:41.015563 systemd[1]: Reached target slices.target - Slice Units. Oct 28 00:12:41.015571 systemd[1]: Reached target swap.target - Swaps. Oct 28 00:12:41.015578 systemd[1]: Reached target timers.target - Timer Units. Oct 28 00:12:41.015585 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 28 00:12:41.015593 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 28 00:12:41.015600 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 28 00:12:41.015607 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Oct 28 00:12:41.015615 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 28 00:12:41.015623 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 28 00:12:41.015631 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 28 00:12:41.015638 systemd[1]: Reached target sockets.target - Socket Units. Oct 28 00:12:41.015646 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Oct 28 00:12:41.015653 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Oct 28 00:12:41.015660 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 28 00:12:41.015667 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 28 00:12:41.015676 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Oct 28 00:12:41.015683 systemd[1]: Starting systemd-fsck-usr.service... Oct 28 00:12:41.015691 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 28 00:12:41.015698 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 28 00:12:41.015705 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:12:41.015714 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 28 00:12:41.015738 systemd-journald[326]: Collecting audit messages is disabled. Oct 28 00:12:41.015757 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 28 00:12:41.015765 systemd[1]: Finished systemd-fsck-usr.service. Oct 28 00:12:41.015772 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Oct 28 00:12:41.015780 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 28 00:12:41.015788 systemd-journald[326]: Journal started Oct 28 00:12:41.015804 systemd-journald[326]: Runtime Journal (/run/log/journal/bad3cf6de18144b2aa974651ab1775f2) is 4.8M, max 38.5M, 33.7M free. Oct 28 00:12:41.018485 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 28 00:12:41.018518 systemd[1]: Started systemd-journald.service - Journal Service. Oct 28 00:12:41.026076 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Oct 28 00:12:41.035068 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 28 00:12:41.042585 systemd-modules-load[331]: Inserted module 'br_netfilter' Oct 28 00:12:41.043033 kernel: Bridge firewalling registered Oct 28 00:12:41.046111 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 28 00:12:41.046711 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 28 00:12:41.051673 systemd-tmpfiles[347]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Oct 28 00:12:41.056100 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 28 00:12:41.059637 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:12:41.061408 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 28 00:12:41.062824 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 28 00:12:41.070263 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 28 00:12:41.072236 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 28 00:12:41.082799 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 28 00:12:41.084125 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 28 00:12:41.098266 dracut-cmdline[371]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=bb8cbc137ff563234eef33bdd51a5c9ee67c90d62b83654276e2a4d312ac5ee1 Oct 28 00:12:41.109968 systemd-resolved[359]: Positive Trust Anchors: Oct 28 00:12:41.109978 systemd-resolved[359]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 28 00:12:41.109980 systemd-resolved[359]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Oct 28 00:12:41.110010 systemd-resolved[359]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 28 00:12:41.131235 systemd-resolved[359]: Defaulting to hostname 'linux'. Oct 28 00:12:41.131855 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 28 00:12:41.132085 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 28 00:12:41.202048 kernel: Loading iSCSI transport class v2.0-870. Oct 28 00:12:41.214264 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Oct 28 00:12:41.232046 kernel: iscsi: registered transport (tcp) Oct 28 00:12:41.269333 kernel: iscsi: registered transport (qla4xxx) Oct 28 00:12:41.269391 kernel: QLogic iSCSI HBA Driver Oct 28 00:12:41.295995 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 28 00:12:41.308682 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 28 00:12:41.309044 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 28 00:12:41.333483 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 28 00:12:41.334513 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 28 00:12:41.335097 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 28 00:12:41.361631 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 28 00:12:41.362706 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 28 00:12:41.381156 systemd-udevd[612]: Using default interface naming scheme 'v257'. Oct 28 00:12:41.388242 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 28 00:12:41.390272 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 28 00:12:41.402223 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 28 00:12:41.403415 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 28 00:12:41.410857 dracut-pre-trigger[694]: rd.md=0: removing MD RAID activation Oct 28 00:12:41.431080 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Oct 28 00:12:41.433061 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 28 00:12:41.437072 systemd-networkd[717]: lo: Link UP Oct 28 00:12:41.437080 systemd-networkd[717]: lo: Gained carrier Oct 28 00:12:41.437375 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 28 00:12:41.437697 systemd[1]: Reached target network.target - Network. Oct 28 00:12:41.525154 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 28 00:12:41.538015 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Oct 28 00:12:41.619038 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Oct 28 00:12:41.622041 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Oct 28 00:12:41.628147 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Oct 28 00:12:41.651250 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Oct 28 00:12:41.651419 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Oct 28 00:12:41.650192 systemd-networkd[717]: eth0: Interface name change detected, renamed to ens192. Oct 28 00:12:41.654048 kernel: cryptd: max_cpu_qlen set to 1000 Oct 28 00:12:41.655337 (udev-worker)[759]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Oct 28 00:12:41.657174 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 28 00:12:41.657287 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:12:41.657780 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:12:41.659519 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:12:41.670645 systemd-networkd[717]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Oct 28 00:12:41.676423 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Oct 28 00:12:41.676573 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Oct 28 00:12:41.675949 systemd-networkd[717]: ens192: Link UP Oct 28 00:12:41.675952 systemd-networkd[717]: ens192: Gained carrier Oct 28 00:12:41.678040 kernel: AES CTR mode by8 optimization enabled Oct 28 00:12:41.696812 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:12:41.805670 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Oct 28 00:12:41.810334 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Oct 28 00:12:41.815893 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Oct 28 00:12:41.821613 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Oct 28 00:12:41.822408 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 28 00:12:41.829248 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 28 00:12:41.829987 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Oct 28 00:12:41.830433 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 28 00:12:41.830670 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 28 00:12:41.831690 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 28 00:12:41.872372 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 28 00:12:42.898047 disk-uuid[860]: Warning: The kernel is still using the old partition table. Oct 28 00:12:42.898047 disk-uuid[860]: The new table will be used at the next reboot or after you Oct 28 00:12:42.898047 disk-uuid[860]: run partprobe(8) or kpartx(8) Oct 28 00:12:42.898047 disk-uuid[860]: The operation has completed successfully. Oct 28 00:12:42.904330 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 28 00:12:42.904390 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 28 00:12:42.905273 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Oct 28 00:12:42.928898 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (879) Oct 28 00:12:42.928938 kernel: BTRFS info (device sda6): first mount of filesystem e5ad038a-d5ed-4440-8f1c-902f5112301b Oct 28 00:12:42.928953 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:12:42.935172 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:12:42.935230 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:12:42.940049 kernel: BTRFS info (device sda6): last unmount of filesystem e5ad038a-d5ed-4440-8f1c-902f5112301b Oct 28 00:12:42.940598 systemd[1]: Finished ignition-setup.service - Ignition (setup). Oct 28 00:12:42.942136 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 28 00:12:43.102140 systemd-networkd[717]: ens192: Gained IPv6LL Oct 28 00:12:43.137860 ignition[898]: Ignition 2.22.0 Oct 28 00:12:43.137869 ignition[898]: Stage: fetch-offline Oct 28 00:12:43.137893 ignition[898]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:43.137899 ignition[898]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:43.137950 ignition[898]: parsed url from cmdline: "" Oct 28 00:12:43.137952 ignition[898]: no config URL provided Oct 28 00:12:43.137955 ignition[898]: reading system config file "/usr/lib/ignition/user.ign" Oct 28 00:12:43.137960 ignition[898]: no config at "/usr/lib/ignition/user.ign" Oct 28 00:12:43.138519 ignition[898]: config successfully fetched Oct 28 00:12:43.138537 ignition[898]: parsing config with SHA512: d37fe120eaaec93157423cec456491d36d937ae4c003d2514a679e023f4b497b88781ed04e73e9ef59991ae0c23abf610d0ef164e31727cd4753215a31769c29 Oct 28 00:12:43.140859 unknown[898]: fetched base config from "system" Oct 28 00:12:43.141195 ignition[898]: fetch-offline: fetch-offline passed Oct 28 00:12:43.140868 unknown[898]: fetched user config from "vmware" Oct 28 00:12:43.141237 ignition[898]: Ignition finished successfully Oct 28 00:12:43.142472 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 28 00:12:43.142938 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Oct 28 00:12:43.143624 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Oct 28 00:12:43.168645 ignition[905]: Ignition 2.22.0 Oct 28 00:12:43.168652 ignition[905]: Stage: kargs Oct 28 00:12:43.168756 ignition[905]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:43.168764 ignition[905]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:43.169485 ignition[905]: kargs: kargs passed Oct 28 00:12:43.169518 ignition[905]: Ignition finished successfully Oct 28 00:12:43.171076 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 28 00:12:43.171928 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Oct 28 00:12:43.197719 ignition[911]: Ignition 2.22.0 Oct 28 00:12:43.197735 ignition[911]: Stage: disks Oct 28 00:12:43.197825 ignition[911]: no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:43.197830 ignition[911]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:43.198496 ignition[911]: disks: disks passed Oct 28 00:12:43.198536 ignition[911]: Ignition finished successfully Oct 28 00:12:43.199534 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Oct 28 00:12:43.199941 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Oct 28 00:12:43.200090 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Oct 28 00:12:43.200292 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 28 00:12:43.200494 systemd[1]: Reached target sysinit.target - System Initialization. Oct 28 00:12:43.200703 systemd[1]: Reached target basic.target - Basic System. Oct 28 00:12:43.201520 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Oct 28 00:12:43.272740 systemd-fsck[919]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Oct 28 00:12:43.274315 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Oct 28 00:12:43.276076 systemd[1]: Mounting sysroot.mount - /sysroot... Oct 28 00:12:43.401037 kernel: EXT4-fs (sda9): mounted filesystem b815ee5e-3be8-4bde-b70d-1e4425ecc899 r/w with ordered data mode. Quota mode: none. Oct 28 00:12:43.401841 systemd[1]: Mounted sysroot.mount - /sysroot. Oct 28 00:12:43.402158 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Oct 28 00:12:43.403756 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 28 00:12:43.406088 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Oct 28 00:12:43.406616 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Oct 28 00:12:43.406842 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Oct 28 00:12:43.407074 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Oct 28 00:12:43.415391 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Oct 28 00:12:43.417101 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Oct 28 00:12:43.422045 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (928) Oct 28 00:12:43.424941 kernel: BTRFS info (device sda6): first mount of filesystem e5ad038a-d5ed-4440-8f1c-902f5112301b Oct 28 00:12:43.424979 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:12:43.431567 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:12:43.431609 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:12:43.432465 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 28 00:12:43.509129 initrd-setup-root[952]: cut: /sysroot/etc/passwd: No such file or directory Oct 28 00:12:43.512393 initrd-setup-root[959]: cut: /sysroot/etc/group: No such file or directory Oct 28 00:12:43.515251 initrd-setup-root[966]: cut: /sysroot/etc/shadow: No such file or directory Oct 28 00:12:43.518762 initrd-setup-root[973]: cut: /sysroot/etc/gshadow: No such file or directory Oct 28 00:12:43.624608 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Oct 28 00:12:43.625741 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Oct 28 00:12:43.628109 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Oct 28 00:12:43.637352 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Oct 28 00:12:43.639050 kernel: BTRFS info (device sda6): last unmount of filesystem e5ad038a-d5ed-4440-8f1c-902f5112301b Oct 28 00:12:43.662636 ignition[1042]: INFO : Ignition 2.22.0 Oct 28 00:12:43.662636 ignition[1042]: INFO : Stage: mount Oct 28 00:12:43.662971 ignition[1042]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:43.662971 ignition[1042]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:43.663236 ignition[1042]: INFO : mount: mount passed Oct 28 00:12:43.663236 ignition[1042]: INFO : Ignition finished successfully Oct 28 00:12:43.664086 systemd[1]: Finished ignition-mount.service - Ignition (mount). Oct 28 00:12:43.664934 systemd[1]: Starting ignition-files.service - Ignition (files)... Oct 28 00:12:43.667608 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Oct 28 00:12:44.402825 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 28 00:12:44.428063 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1052) Oct 28 00:12:44.430489 kernel: BTRFS info (device sda6): first mount of filesystem e5ad038a-d5ed-4440-8f1c-902f5112301b Oct 28 00:12:44.430546 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 28 00:12:44.434040 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 28 00:12:44.434073 kernel: BTRFS info (device sda6): enabling free space tree Oct 28 00:12:44.435247 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 28 00:12:44.454404 ignition[1068]: INFO : Ignition 2.22.0 Oct 28 00:12:44.454404 ignition[1068]: INFO : Stage: files Oct 28 00:12:44.454779 ignition[1068]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:44.454779 ignition[1068]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:44.455036 ignition[1068]: DEBUG : files: compiled without relabeling support, skipping Oct 28 00:12:44.467284 ignition[1068]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Oct 28 00:12:44.467284 ignition[1068]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Oct 28 00:12:44.485439 ignition[1068]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Oct 28 00:12:44.485653 ignition[1068]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Oct 28 00:12:44.485810 ignition[1068]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Oct 28 00:12:44.485690 unknown[1068]: wrote ssh authorized keys file for user: core Oct 28 00:12:44.496428 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 28 00:12:44.496787 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Oct 28 00:12:44.595944 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Oct 28 00:12:44.673077 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 28 00:12:44.673077 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: 
createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 28 00:12:44.673576 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 28 00:12:44.688116 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Oct 28 00:12:44.688367 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Oct 28 00:12:44.688367 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 28 00:12:44.695143 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 28 00:12:44.695143 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 28 00:12:44.695721 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Oct 28 00:12:45.119796 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Oct 28 00:12:45.422065 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Oct 28 00:12:45.422065 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Oct 28 00:12:45.423225 ignition[1068]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Oct 28 00:12:45.423225 ignition[1068]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Oct 28 00:12:45.423674 ignition[1068]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(e): op(f): 
[finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Oct 28 00:12:45.424131 ignition[1068]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Oct 28 00:12:45.580376 ignition[1068]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Oct 28 00:12:45.584753 ignition[1068]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Oct 28 00:12:45.585020 ignition[1068]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Oct 28 00:12:45.585020 ignition[1068]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Oct 28 00:12:45.585020 ignition[1068]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Oct 28 00:12:45.586333 ignition[1068]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Oct 28 00:12:45.586333 ignition[1068]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Oct 28 00:12:45.586333 ignition[1068]: INFO : files: files passed Oct 28 00:12:45.586333 ignition[1068]: INFO : Ignition finished successfully Oct 28 00:12:45.586473 systemd[1]: Finished ignition-files.service - Ignition (files). Oct 28 00:12:45.587573 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Oct 28 00:12:45.589140 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Oct 28 00:12:45.599282 systemd[1]: ignition-quench.service: Deactivated successfully. Oct 28 00:12:45.599375 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Oct 28 00:12:45.601438 initrd-setup-root-after-ignition[1101]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 28 00:12:45.601438 initrd-setup-root-after-ignition[1101]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Oct 28 00:12:45.602639 initrd-setup-root-after-ignition[1105]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 28 00:12:45.603729 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 28 00:12:45.603965 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Oct 28 00:12:45.604600 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Oct 28 00:12:45.638500 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Oct 28 00:12:45.638575 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Oct 28 00:12:45.638859 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Oct 28 00:12:45.638988 systemd[1]: Reached target initrd.target - Initrd Default Target. Oct 28 00:12:45.639341 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Oct 28 00:12:45.639843 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Oct 28 00:12:45.655514 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 28 00:12:45.656321 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... 
Oct 28 00:12:45.666837 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Oct 28 00:12:45.666938 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Oct 28 00:12:45.667164 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 28 00:12:45.667385 systemd[1]: Stopped target timers.target - Timer Units. Oct 28 00:12:45.667596 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Oct 28 00:12:45.667672 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 28 00:12:45.668039 systemd[1]: Stopped target initrd.target - Initrd Default Target. Oct 28 00:12:45.668219 systemd[1]: Stopped target basic.target - Basic System. Oct 28 00:12:45.668430 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Oct 28 00:12:45.668606 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Oct 28 00:12:45.668827 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Oct 28 00:12:45.669055 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Oct 28 00:12:45.669256 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Oct 28 00:12:45.669468 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Oct 28 00:12:45.669714 systemd[1]: Stopped target sysinit.target - System Initialization. Oct 28 00:12:45.669968 systemd[1]: Stopped target local-fs.target - Local File Systems. Oct 28 00:12:45.670209 systemd[1]: Stopped target swap.target - Swaps. Oct 28 00:12:45.670381 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Oct 28 00:12:45.670470 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Oct 28 00:12:45.670850 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Oct 28 00:12:45.671146 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 28 00:12:45.671320 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Oct 28 00:12:45.671388 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 28 00:12:45.671590 systemd[1]: dracut-initqueue.service: Deactivated successfully. Oct 28 00:12:45.671704 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Oct 28 00:12:45.672111 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Oct 28 00:12:45.672220 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Oct 28 00:12:45.672563 systemd[1]: Stopped target paths.target - Path Units. Oct 28 00:12:45.672734 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Oct 28 00:12:45.672820 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 28 00:12:45.673165 systemd[1]: Stopped target slices.target - Slice Units. Oct 28 00:12:45.673412 systemd[1]: Stopped target sockets.target - Socket Units. Oct 28 00:12:45.673612 systemd[1]: iscsid.socket: Deactivated successfully. Oct 28 00:12:45.673695 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Oct 28 00:12:45.673957 systemd[1]: iscsiuio.socket: Deactivated successfully. Oct 28 00:12:45.674048 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 28 00:12:45.674328 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. 
Oct 28 00:12:45.674446 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 28 00:12:45.674708 systemd[1]: ignition-files.service: Deactivated successfully. Oct 28 00:12:45.674815 systemd[1]: Stopped ignition-files.service - Ignition (files). Oct 28 00:12:45.675767 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Oct 28 00:12:45.677179 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Oct 28 00:12:45.677338 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Oct 28 00:12:45.677457 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 28 00:12:45.677751 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Oct 28 00:12:45.677855 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Oct 28 00:12:45.680909 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Oct 28 00:12:45.681245 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Oct 28 00:12:45.684446 systemd[1]: initrd-cleanup.service: Deactivated successfully. Oct 28 00:12:45.685185 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Oct 28 00:12:45.696943 systemd[1]: sysroot-boot.mount: Deactivated successfully. Oct 28 00:12:45.700035 ignition[1126]: INFO : Ignition 2.22.0 Oct 28 00:12:45.700035 ignition[1126]: INFO : Stage: umount Oct 28 00:12:45.700035 ignition[1126]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 28 00:12:45.700035 ignition[1126]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 28 00:12:45.700843 ignition[1126]: INFO : umount: umount passed Oct 28 00:12:45.700843 ignition[1126]: INFO : Ignition finished successfully Oct 28 00:12:45.701866 systemd[1]: ignition-mount.service: Deactivated successfully. Oct 28 00:12:45.701949 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Oct 28 00:12:45.702416 systemd[1]: Stopped target network.target - Network. Oct 28 00:12:45.702905 systemd[1]: ignition-disks.service: Deactivated successfully. Oct 28 00:12:45.702952 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Oct 28 00:12:45.703102 systemd[1]: ignition-kargs.service: Deactivated successfully. Oct 28 00:12:45.703139 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Oct 28 00:12:45.703472 systemd[1]: ignition-setup.service: Deactivated successfully. Oct 28 00:12:45.703503 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Oct 28 00:12:45.703997 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Oct 28 00:12:45.704039 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Oct 28 00:12:45.704597 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Oct 28 00:12:45.705049 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Oct 28 00:12:45.712687 systemd[1]: systemd-resolved.service: Deactivated successfully. Oct 28 00:12:45.712755 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Oct 28 00:12:45.714010 systemd[1]: systemd-networkd.service: Deactivated successfully. Oct 28 00:12:45.714374 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Oct 28 00:12:45.715805 systemd[1]: Stopped target network-pre.target - Preparation for Network. Oct 28 00:12:45.716292 systemd[1]: systemd-networkd.socket: Deactivated successfully. Oct 28 00:12:45.716422 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Oct 28 00:12:45.717267 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Oct 28 00:12:45.717487 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Oct 28 00:12:45.717654 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 28 00:12:45.717935 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Oct 28 00:12:45.718075 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Oct 28 00:12:45.718502 systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 28 00:12:45.718533 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Oct 28 00:12:45.718888 systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 28 00:12:45.718913 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Oct 28 00:12:45.719316 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 28 00:12:45.730764 systemd[1]: systemd-udevd.service: Deactivated successfully. Oct 28 00:12:45.731021 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 28 00:12:45.731416 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Oct 28 00:12:45.731442 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Oct 28 00:12:45.731787 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Oct 28 00:12:45.731907 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Oct 28 00:12:45.732163 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Oct 28 00:12:45.732291 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Oct 28 00:12:45.732580 systemd[1]: dracut-cmdline.service: Deactivated successfully. Oct 28 00:12:45.732604 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Oct 28 00:12:45.732965 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Oct 28 00:12:45.733092 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 28 00:12:45.733798 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Oct 28 00:12:45.734040 systemd[1]: systemd-network-generator.service: Deactivated successfully. Oct 28 00:12:45.734177 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Oct 28 00:12:45.734441 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Oct 28 00:12:45.734465 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 28 00:12:45.734711 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Oct 28 00:12:45.734735 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 28 00:12:45.734984 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Oct 28 00:12:45.735007 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Oct 28 00:12:45.735653 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 28 00:12:45.735682 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:12:45.746295 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Oct 28 00:12:45.746362 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
Oct 28 00:12:45.761001 systemd[1]: network-cleanup.service: Deactivated successfully. Oct 28 00:12:45.761105 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Oct 28 00:12:45.841085 systemd[1]: sysroot-boot.service: Deactivated successfully. Oct 28 00:12:45.841200 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Oct 28 00:12:45.841644 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Oct 28 00:12:45.841803 systemd[1]: initrd-setup-root.service: Deactivated successfully. Oct 28 00:12:45.841841 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Oct 28 00:12:45.842678 systemd[1]: Starting initrd-switch-root.service - Switch Root... Oct 28 00:12:45.877482 systemd[1]: Switching root. Oct 28 00:12:45.915450 systemd-journald[326]: Journal stopped Oct 28 00:12:47.864616 systemd-journald[326]: Received SIGTERM from PID 1 (systemd). Oct 28 00:12:47.864641 kernel: SELinux: policy capability network_peer_controls=1 Oct 28 00:12:47.864651 kernel: SELinux: policy capability open_perms=1 Oct 28 00:12:47.864658 kernel: SELinux: policy capability extended_socket_class=1 Oct 28 00:12:47.864664 kernel: SELinux: policy capability always_check_network=0 Oct 28 00:12:47.864670 kernel: SELinux: policy capability cgroup_seclabel=1 Oct 28 00:12:47.864679 kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 28 00:12:47.864685 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Oct 28 00:12:47.864692 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Oct 28 00:12:47.864698 kernel: SELinux: policy capability userspace_initial_context=0 Oct 28 00:12:47.864704 kernel: audit: type=1403 audit(1761610366.905:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Oct 28 00:12:47.864712 systemd[1]: Successfully loaded SELinux policy in 52.705ms. Oct 28 00:12:47.864722 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.188ms. Oct 28 00:12:47.864730 systemd[1]: systemd 257.7 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 28 00:12:47.864738 systemd[1]: Detected virtualization vmware. Oct 28 00:12:47.864747 systemd[1]: Detected architecture x86-64. Oct 28 00:12:47.864754 systemd[1]: Detected first boot. Oct 28 00:12:47.864761 systemd[1]: Initializing machine ID from random generator. Oct 28 00:12:47.864769 zram_generator::config[1169]: No configuration found. Oct 28 00:12:47.864884 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Oct 28 00:12:47.864898 kernel: Guest personality initialized and is active Oct 28 00:12:47.864905 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Oct 28 00:12:47.864912 kernel: Initialized host personality Oct 28 00:12:47.864919 kernel: NET: Registered PF_VSOCK protocol family Oct 28 00:12:47.864926 systemd[1]: Populated /etc with preset unit settings. Oct 28 00:12:47.864935 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:12:47.864944 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Oct 28 00:12:47.864952 systemd[1]: initrd-switch-root.service: Deactivated successfully. 
Oct 28 00:12:47.864959 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Oct 28 00:12:47.864967 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Oct 28 00:12:47.864974 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Oct 28 00:12:47.864982 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Oct 28 00:12:47.864991 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Oct 28 00:12:47.864998 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Oct 28 00:12:47.865006 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Oct 28 00:12:47.865013 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Oct 28 00:12:47.865021 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Oct 28 00:12:47.865687 systemd[1]: Created slice user.slice - User and Session Slice. Oct 28 00:12:47.865701 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 28 00:12:47.865709 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 28 00:12:47.865720 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Oct 28 00:12:47.865727 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Oct 28 00:12:47.865735 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Oct 28 00:12:47.865743 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 28 00:12:47.865752 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Oct 28 00:12:47.865761 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 28 00:12:47.865769 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 28 00:12:47.865777 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Oct 28 00:12:47.865784 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Oct 28 00:12:47.865792 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Oct 28 00:12:47.865800 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Oct 28 00:12:47.865809 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 28 00:12:47.865818 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 28 00:12:47.865825 systemd[1]: Reached target slices.target - Slice Units. Oct 28 00:12:47.865833 systemd[1]: Reached target swap.target - Swaps. Oct 28 00:12:47.865841 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Oct 28 00:12:47.865849 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Oct 28 00:12:47.865858 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Oct 28 00:12:47.865866 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 28 00:12:47.865874 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 28 00:12:47.865882 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 28 00:12:47.865891 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. 
Oct 28 00:12:47.865899 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Oct 28 00:12:47.865907 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Oct 28 00:12:47.865915 systemd[1]: Mounting media.mount - External Media Directory... Oct 28 00:12:47.865923 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:47.865932 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Oct 28 00:12:47.865939 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Oct 28 00:12:47.865948 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Oct 28 00:12:47.865956 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Oct 28 00:12:47.865964 systemd[1]: Reached target machines.target - Containers. Oct 28 00:12:47.865972 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Oct 28 00:12:47.865980 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Oct 28 00:12:47.865987 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 28 00:12:47.865995 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Oct 28 00:12:47.866004 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 28 00:12:47.866012 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 28 00:12:47.866020 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 28 00:12:47.866062 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 28 00:12:47.866071 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 28 00:12:47.866079 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Oct 28 00:12:47.866088 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Oct 28 00:12:47.866096 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Oct 28 00:12:47.866104 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Oct 28 00:12:47.866112 systemd[1]: Stopped systemd-fsck-usr.service. Oct 28 00:12:47.866120 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 28 00:12:47.866128 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 28 00:12:47.866136 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 28 00:12:47.866145 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 28 00:12:47.866153 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Oct 28 00:12:47.866161 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Oct 28 00:12:47.866169 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 28 00:12:47.866177 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Oct 28 00:12:47.866185 kernel: fuse: init (API version 7.41) Oct 28 00:12:47.866193 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Oct 28 00:12:47.866203 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Oct 28 00:12:47.866210 systemd[1]: Mounted media.mount - External Media Directory. Oct 28 00:12:47.866218 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Oct 28 00:12:47.866226 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Oct 28 00:12:47.866233 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Oct 28 00:12:47.866241 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 28 00:12:47.866250 systemd[1]: modprobe@configfs.service: Deactivated successfully. Oct 28 00:12:47.866258 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Oct 28 00:12:47.866266 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 28 00:12:47.866274 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 28 00:12:47.866282 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 28 00:12:47.866290 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 28 00:12:47.866298 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 28 00:12:47.866307 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 28 00:12:47.866315 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 28 00:12:47.866323 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 28 00:12:47.866330 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 28 00:12:47.866338 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Oct 28 00:12:47.866346 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 28 00:12:47.866355 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Oct 28 00:12:47.866364 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Oct 28 00:12:47.866372 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Oct 28 00:12:47.866380 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Oct 28 00:12:47.866391 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 28 00:12:47.866400 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Oct 28 00:12:47.866408 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 28 00:12:47.866434 systemd-journald[1259]: Collecting audit messages is disabled. Oct 28 00:12:47.866455 systemd-journald[1259]: Journal started Oct 28 00:12:47.866473 systemd-journald[1259]: Runtime Journal (/run/log/journal/6d62f41016254adaaa1a82ef3f806c86) is 4.8M, max 38.5M, 33.7M free. Oct 28 00:12:47.648289 systemd[1]: Queued start job for default target multi-user.target. Oct 28 00:12:47.653855 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Oct 28 00:12:47.654127 systemd[1]: systemd-journald.service: Deactivated successfully. Oct 28 00:12:47.867002 jq[1239]: true Oct 28 00:12:47.867556 jq[1271]: true Oct 28 00:12:47.875166 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Oct 28 00:12:47.875211 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 28 00:12:47.881047 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Oct 28 00:12:47.881086 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 28 00:12:47.885062 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Oct 28 00:12:47.901988 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 28 00:12:47.902060 systemd[1]: Started systemd-journald.service - Journal Service. Oct 28 00:12:47.903200 kernel: ACPI: bus type drm_connector registered Oct 28 00:12:47.906154 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Oct 28 00:12:47.906418 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 28 00:12:47.906536 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 28 00:12:47.906823 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 28 00:12:47.907261 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Oct 28 00:12:47.908589 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 28 00:12:47.910293 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Oct 28 00:12:47.912293 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Oct 28 00:12:47.920049 kernel: loop1: detected capacity change from 0 to 219144 Oct 28 00:12:47.926619 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Oct 28 00:12:47.929121 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Oct 28 00:12:47.931452 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Oct 28 00:12:47.933840 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 28 00:12:47.955226 ignition[1293]: Ignition 2.22.0 Oct 28 00:12:47.955440 ignition[1293]: deleting config from guestinfo properties Oct 28 00:12:48.063762 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 28 00:12:48.067294 ignition[1293]: Successfully deleted config Oct 28 00:12:48.068982 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Oct 28 00:12:48.070268 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Oct 28 00:12:48.070282 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Oct 28 00:12:48.075142 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 28 00:12:48.078910 systemd-journald[1259]: Time spent on flushing to /var/log/journal/6d62f41016254adaaa1a82ef3f806c86 is 21.566ms for 1766 entries. Oct 28 00:12:48.078910 systemd-journald[1259]: System Journal (/var/log/journal/6d62f41016254adaaa1a82ef3f806c86) is 8M, max 588.1M, 580.1M free. Oct 28 00:12:48.110552 systemd-journald[1259]: Received client request to flush runtime journal. Oct 28 00:12:48.078232 systemd[1]: Starting systemd-sysusers.service - Create System Users... Oct 28 00:12:48.103306 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 28 00:12:48.111803 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
Oct 28 00:12:48.119082 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Oct 28 00:12:48.139036 kernel: loop2: detected capacity change from 0 to 110976 Oct 28 00:12:48.145145 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 28 00:12:48.149163 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 28 00:12:48.152985 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 28 00:12:48.166583 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Oct 28 00:12:48.169095 kernel: loop3: detected capacity change from 0 to 128048 Oct 28 00:12:48.178091 systemd-tmpfiles[1340]: ACLs are not supported, ignoring. Oct 28 00:12:48.178107 systemd-tmpfiles[1340]: ACLs are not supported, ignoring. Oct 28 00:12:48.181170 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 28 00:12:48.198795 systemd[1]: Started systemd-userdbd.service - User Database Manager. Oct 28 00:12:48.257277 systemd-resolved[1339]: Positive Trust Anchors: Oct 28 00:12:48.257288 systemd-resolved[1339]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 28 00:12:48.257291 systemd-resolved[1339]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Oct 28 00:12:48.257314 systemd-resolved[1339]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 28 00:12:48.260193 systemd-resolved[1339]: Defaulting to hostname 'linux'. Oct 28 00:12:48.261106 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 28 00:12:48.261284 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 28 00:12:48.276043 kernel: loop4: detected capacity change from 0 to 2960 Oct 28 00:12:48.297048 kernel: loop5: detected capacity change from 0 to 219144 Oct 28 00:12:48.439045 kernel: loop6: detected capacity change from 0 to 110976 Oct 28 00:12:48.581048 kernel: loop7: detected capacity change from 0 to 128048 Oct 28 00:12:48.654994 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Oct 28 00:12:48.756876 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Oct 28 00:12:48.758781 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 28 00:12:48.787834 systemd-udevd[1355]: Using default interface naming scheme 'v257'. Oct 28 00:12:48.834041 kernel: loop1: detected capacity change from 0 to 2960 Oct 28 00:12:48.977059 (sd-merge)[1353]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-vmware.raw'. Oct 28 00:12:48.979238 (sd-merge)[1353]: Merged extensions into '/usr'. Oct 28 00:12:48.984413 systemd[1]: Reload requested from client PID 1291 ('systemd-sysext') (unit systemd-sysext.service)... Oct 28 00:12:48.984527 systemd[1]: Reloading... Oct 28 00:12:49.060037 zram_generator::config[1400]: No configuration found. 
Oct 28 00:12:49.090246 kernel: mousedev: PS/2 mouse device common for all mice Oct 28 00:12:49.090285 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Oct 28 00:12:49.103043 kernel: ACPI: button: Power Button [PWRF] Oct 28 00:12:49.203517 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:12:49.246069 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Oct 28 00:12:49.265557 (udev-worker)[1378]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Oct 28 00:12:49.273615 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Oct 28 00:12:49.274041 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Oct 28 00:12:49.274222 systemd[1]: Reloading finished in 289 ms. Oct 28 00:12:49.288687 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 28 00:12:49.298754 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Oct 28 00:12:49.320992 systemd[1]: Starting ensure-sysext.service... Oct 28 00:12:49.324116 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Oct 28 00:12:49.330117 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 28 00:12:49.336445 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 28 00:12:49.340616 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 28 00:12:49.353467 systemd[1]: Reload requested from client PID 1479 ('systemctl') (unit ensure-sysext.service)... Oct 28 00:12:49.353476 systemd[1]: Reloading... Oct 28 00:12:49.380718 systemd-tmpfiles[1482]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Oct 28 00:12:49.380738 systemd-tmpfiles[1482]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Oct 28 00:12:49.380888 systemd-tmpfiles[1482]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Oct 28 00:12:49.381542 systemd-tmpfiles[1482]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Oct 28 00:12:49.383118 systemd-tmpfiles[1482]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Oct 28 00:12:49.383373 systemd-tmpfiles[1482]: ACLs are not supported, ignoring. Oct 28 00:12:49.383517 systemd-tmpfiles[1482]: ACLs are not supported, ignoring. Oct 28 00:12:49.393045 systemd-tmpfiles[1482]: Detected autofs mount point /boot during canonicalization of boot. Oct 28 00:12:49.393118 systemd-tmpfiles[1482]: Skipping /boot Oct 28 00:12:49.402499 systemd-tmpfiles[1482]: Detected autofs mount point /boot during canonicalization of boot. Oct 28 00:12:49.402582 systemd-tmpfiles[1482]: Skipping /boot Oct 28 00:12:49.414050 zram_generator::config[1519]: No configuration found. Oct 28 00:12:49.443467 systemd-networkd[1481]: lo: Link UP Oct 28 00:12:49.443472 systemd-networkd[1481]: lo: Gained carrier Oct 28 00:12:49.444414 systemd-networkd[1481]: ens192: Configuring with /etc/systemd/network/00-vmware.network. 
Oct 28 00:12:49.447090 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Oct 28 00:12:49.448050 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Oct 28 00:12:49.447572 systemd-networkd[1481]: ens192: Link UP Oct 28 00:12:49.447697 systemd-networkd[1481]: ens192: Gained carrier Oct 28 00:12:49.505937 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:12:49.569764 systemd[1]: Reloading finished in 216 ms. Oct 28 00:12:49.587126 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 28 00:12:49.598370 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Oct 28 00:12:49.598845 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 28 00:12:49.600017 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 28 00:12:49.605449 systemd[1]: Reached target network.target - Network. Oct 28 00:12:49.606679 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 28 00:12:49.607828 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Oct 28 00:12:49.613849 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Oct 28 00:12:49.614964 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Oct 28 00:12:49.616219 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Oct 28 00:12:49.620320 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Oct 28 00:12:49.624077 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Oct 28 00:12:49.627040 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.633349 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 28 00:12:49.637842 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 28 00:12:49.641692 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 28 00:12:49.641895 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 28 00:12:49.641979 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 28 00:12:49.642079 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.645126 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.645251 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 28 00:12:49.645319 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Oct 28 00:12:49.645403 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.648987 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.654417 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 28 00:12:49.654620 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 28 00:12:49.654692 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 28 00:12:49.654788 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 28 00:12:49.655249 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Oct 28 00:12:49.665182 systemd[1]: Finished ensure-sysext.service. Oct 28 00:12:49.665607 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Oct 28 00:12:49.665871 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 28 00:12:49.666189 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 28 00:12:49.670182 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Oct 28 00:12:49.670508 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 28 00:12:49.675371 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 28 00:12:49.675791 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 28 00:12:49.675924 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 28 00:12:49.676774 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 28 00:12:49.676907 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 28 00:12:49.677782 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 28 00:12:49.677842 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 28 00:12:49.724117 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Oct 28 00:12:49.724320 systemd[1]: Reached target time-set.target - System Time Set. Oct 28 00:12:49.761102 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Oct 28 00:12:49.852739 augenrules[1626]: No rules Oct 28 00:12:49.853379 systemd[1]: audit-rules.service: Deactivated successfully. Oct 28 00:12:49.853655 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 28 00:14:27.733728 systemd-resolved[1339]: Clock change detected. Flushing caches. Oct 28 00:14:27.733816 systemd-timesyncd[1608]: Contacted time server 144.202.0.197:123 (0.flatcar.pool.ntp.org). Oct 28 00:14:27.733848 systemd-timesyncd[1608]: Initial clock synchronization to Tue 2025-10-28 00:14:27.733690 UTC. Oct 28 00:14:27.743184 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
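The timesyncd entries above show the clock being stepped after contacting 0.flatcar.pool.ntp.org. The read-only check below confirms the active server, and the optional drop-in pins the pool explicitly; only the server already named in the log is used, and the drop-in path is plain systemd convention rather than anything Flatcar-specific shown here.

timedatectl timesync-status
sudo mkdir -p /etc/systemd/timesyncd.conf.d
sudo tee /etc/systemd/timesyncd.conf.d/10-ntp.conf <<'EOF'
[Time]
NTP=0.flatcar.pool.ntp.org
EOF
sudo systemctl restart systemd-timesyncd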
Oct 28 00:14:27.743406 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 28 00:14:28.221039 ldconfig[1583]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Oct 28 00:14:28.223779 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Oct 28 00:14:28.225014 systemd[1]: Starting systemd-update-done.service - Update is Completed... Oct 28 00:14:28.237392 systemd[1]: Finished systemd-update-done.service - Update is Completed. Oct 28 00:14:28.237667 systemd[1]: Reached target sysinit.target - System Initialization. Oct 28 00:14:28.237852 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Oct 28 00:14:28.237985 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Oct 28 00:14:28.238105 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Oct 28 00:14:28.238396 systemd[1]: Started logrotate.timer - Daily rotation of log files. Oct 28 00:14:28.238627 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Oct 28 00:14:28.238747 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Oct 28 00:14:28.238861 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Oct 28 00:14:28.238883 systemd[1]: Reached target paths.target - Path Units. Oct 28 00:14:28.238971 systemd[1]: Reached target timers.target - Timer Units. Oct 28 00:14:28.239825 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Oct 28 00:14:28.241481 systemd[1]: Starting docker.socket - Docker Socket for the API... Oct 28 00:14:28.243000 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Oct 28 00:14:28.243205 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Oct 28 00:14:28.243326 systemd[1]: Reached target ssh-access.target - SSH Access Available. Oct 28 00:14:28.247711 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Oct 28 00:14:28.248046 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Oct 28 00:14:28.248597 systemd[1]: Listening on docker.socket - Docker Socket for the API. Oct 28 00:14:28.249248 systemd[1]: Reached target sockets.target - Socket Units. Oct 28 00:14:28.249345 systemd[1]: Reached target basic.target - Basic System. Oct 28 00:14:28.249464 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Oct 28 00:14:28.249484 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Oct 28 00:14:28.250354 systemd[1]: Starting containerd.service - containerd container runtime... Oct 28 00:14:28.252613 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Oct 28 00:14:28.254402 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Oct 28 00:14:28.255750 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Oct 28 00:14:28.258516 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
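The block above is systemd reaching sysinit, paths, timers and sockets while listing the individual socket, timer and path units it started. These read-only commands enumerate the same units at runtime; unit names are taken from the log, and list-paths is assumed to exist on a systemd this recent.

systemctl list-sockets --all
systemctl list-timers --all
systemctl list-paths --all   # assumption: present in newer systemd releases
systemctl status dbus.socket docker.socket sshd.socket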
Oct 28 00:14:28.258648 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Oct 28 00:14:28.259817 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Oct 28 00:14:28.271370 jq[1641]: false Oct 28 00:14:28.271772 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Oct 28 00:14:28.273336 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Oct 28 00:14:28.277950 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Refreshing passwd entry cache Oct 28 00:14:28.277952 oslogin_cache_refresh[1643]: Refreshing passwd entry cache Oct 28 00:14:28.279675 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Oct 28 00:14:28.281133 extend-filesystems[1642]: Found /dev/sda6 Oct 28 00:14:28.283972 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Oct 28 00:14:28.285822 extend-filesystems[1642]: Found /dev/sda9 Oct 28 00:14:28.285943 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Failure getting users, quitting Oct 28 00:14:28.285937 oslogin_cache_refresh[1643]: Failure getting users, quitting Oct 28 00:14:28.285985 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 28 00:14:28.285985 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Refreshing group entry cache Oct 28 00:14:28.285950 oslogin_cache_refresh[1643]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 28 00:14:28.285979 oslogin_cache_refresh[1643]: Refreshing group entry cache Oct 28 00:14:28.286627 systemd[1]: Starting systemd-logind.service - User Login Management... Oct 28 00:14:28.287269 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Oct 28 00:14:28.287801 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Oct 28 00:14:28.291970 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Failure getting groups, quitting Oct 28 00:14:28.291970 google_oslogin_nss_cache[1643]: oslogin_cache_refresh[1643]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 28 00:14:28.289511 systemd[1]: Starting update-engine.service - Update Engine... Oct 28 00:14:28.288664 oslogin_cache_refresh[1643]: Failure getting groups, quitting Oct 28 00:14:28.288672 oslogin_cache_refresh[1643]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 28 00:14:28.292346 extend-filesystems[1642]: Checking size of /dev/sda9 Oct 28 00:14:28.296703 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Oct 28 00:14:28.298721 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Oct 28 00:14:28.301871 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Oct 28 00:14:28.302150 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Oct 28 00:14:28.302278 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Oct 28 00:14:28.302424 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Oct 28 00:14:28.302559 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Oct 28 00:14:28.317616 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Oct 28 00:14:28.319672 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Oct 28 00:14:28.324186 update_engine[1653]: I20251028 00:14:28.324043 1653 main.cc:92] Flatcar Update Engine starting Oct 28 00:14:28.323024 systemd[1]: motdgen.service: Deactivated successfully. Oct 28 00:14:28.323407 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Oct 28 00:14:28.328127 extend-filesystems[1642]: Resized partition /dev/sda9 Oct 28 00:14:28.330627 jq[1656]: true Oct 28 00:14:28.338634 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Oct 28 00:14:28.345731 extend-filesystems[1688]: resize2fs 1.47.3 (8-Jul-2025) Oct 28 00:14:28.346314 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Oct 28 00:14:28.354955 (ntainerd)[1690]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Oct 28 00:14:28.407160 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 1635323 blocks Oct 28 00:14:28.385297 unknown[1689]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Oct 28 00:14:28.407387 jq[1686]: true Oct 28 00:14:28.409175 kernel: EXT4-fs (sda9): resized filesystem to 1635323 Oct 28 00:14:28.386792 unknown[1689]: Core dump limit set to -1 Oct 28 00:14:28.420149 tar[1664]: linux-amd64/LICENSE Oct 28 00:14:28.420326 extend-filesystems[1688]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Oct 28 00:14:28.420326 extend-filesystems[1688]: old_desc_blocks = 1, new_desc_blocks = 1 Oct 28 00:14:28.420326 extend-filesystems[1688]: The filesystem on /dev/sda9 is now 1635323 (4k) blocks long. Oct 28 00:14:28.421406 tar[1664]: linux-amd64/helm Oct 28 00:14:28.421425 extend-filesystems[1642]: Resized filesystem in /dev/sda9 Oct 28 00:14:28.422839 systemd[1]: extend-filesystems.service: Deactivated successfully. Oct 28 00:14:28.428933 bash[1712]: Updated "/home/core/.ssh/authorized_keys" Oct 28 00:14:28.422985 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Oct 28 00:14:28.426952 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Oct 28 00:14:28.427387 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Oct 28 00:14:28.429404 systemd-logind[1652]: Watching system buttons on /dev/input/event2 (Power Button) Oct 28 00:14:28.429418 systemd-logind[1652]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Oct 28 00:14:28.430992 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Oct 28 00:14:28.432154 systemd-logind[1652]: New seat seat0. Oct 28 00:14:28.436328 systemd[1]: Started systemd-logind.service - User Login Management. Oct 28 00:14:28.461516 dbus-daemon[1639]: [system] SELinux support is enabled Oct 28 00:14:28.461661 systemd[1]: Started dbus.service - D-Bus System Message Bus. Oct 28 00:14:28.463442 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Oct 28 00:14:28.463462 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
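extend-filesystems.service above grows the root ext4 filesystem on /dev/sda9 online (from 1617920 to 1635323 4k blocks). The same operation done by hand would look like this; ext4 supports growing while mounted, and the device name is taken from the log.

lsblk /dev/sda
sudo resize2fs /dev/sda9   # online grow of the mounted root filesystem
df -h /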
Oct 28 00:14:28.464759 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Oct 28 00:14:28.464772 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Oct 28 00:14:28.479104 systemd[1]: Started update-engine.service - Update Engine. Oct 28 00:14:28.480821 update_engine[1653]: I20251028 00:14:28.479977 1653 update_check_scheduler.cc:74] Next update check in 7m51s Oct 28 00:14:28.495293 systemd[1]: Started locksmithd.service - Cluster reboot manager. Oct 28 00:14:28.654054 locksmithd[1722]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Oct 28 00:14:28.658165 sshd_keygen[1681]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Oct 28 00:14:28.673872 systemd-networkd[1481]: ens192: Gained IPv6LL Oct 28 00:14:28.676690 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Oct 28 00:14:28.677139 systemd[1]: Reached target network-online.target - Network is Online. Oct 28 00:14:28.679995 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Oct 28 00:14:28.684875 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:14:28.692776 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Oct 28 00:14:28.705599 containerd[1690]: time="2025-10-28T00:14:28Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Oct 28 00:14:28.706890 containerd[1690]: time="2025-10-28T00:14:28.706869129Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Oct 28 00:14:28.718076 containerd[1690]: time="2025-10-28T00:14:28.718044437Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.807µs" Oct 28 00:14:28.718162 containerd[1690]: time="2025-10-28T00:14:28.718150576Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Oct 28 00:14:28.718203 containerd[1690]: time="2025-10-28T00:14:28.718195515Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Oct 28 00:14:28.718353 containerd[1690]: time="2025-10-28T00:14:28.718343327Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719204071Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719241279Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719293912Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719305029Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719432877Z" level=info msg="skip loading plugin" error="path 
/var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719445764Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719457427Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719463330Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719551245Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719669798Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719771 containerd[1690]: time="2025-10-28T00:14:28.719688806Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 28 00:14:28.719615 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Oct 28 00:14:28.719987 containerd[1690]: time="2025-10-28T00:14:28.719697437Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Oct 28 00:14:28.719987 containerd[1690]: time="2025-10-28T00:14:28.719716239Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Oct 28 00:14:28.721778 containerd[1690]: time="2025-10-28T00:14:28.721752759Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Oct 28 00:14:28.722252 containerd[1690]: time="2025-10-28T00:14:28.722239578Z" level=info msg="metadata content store policy set" policy=shared Oct 28 00:14:28.725755 systemd[1]: Starting issuegen.service - Generate /run/issue... Oct 28 00:14:28.745695 systemd[1]: issuegen.service: Deactivated successfully. Oct 28 00:14:28.745878 systemd[1]: Finished issuegen.service - Generate /run/issue. Oct 28 00:14:28.747443 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Oct 28 00:14:28.770873 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Oct 28 00:14:28.772175 systemd[1]: Started getty@tty1.service - Getty on tty1. Oct 28 00:14:28.774664 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Oct 28 00:14:28.774879 systemd[1]: Reached target getty.target - Login Prompts. Oct 28 00:14:28.782603 systemd[1]: coreos-metadata.service: Deactivated successfully. Oct 28 00:14:28.782764 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Oct 28 00:14:28.783092 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
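A few entries above, containerd warns that its configuration was migrated from version 2 and suggests `containerd config migrate`. A sketch of persisting the migrated file follows; the log shows the active config at /usr/share/containerd/config.toml, so the /etc/containerd/config.toml target below is an assumption about how this image is wired, and treating `containerd config migrate` as printing to stdout follows the pattern of `containerd config default`.

containerd config migrate > /tmp/config.toml          # assumption: prints migrated config to stdout
sudo install -m 0644 /tmp/config.toml /etc/containerd/config.toml   # target path is an assumption
sudo systemctl restart containerd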
Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788649383Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788695058Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788717562Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788725888Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788733781Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788740055Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788748436Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788756922Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788763878Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788770542Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788775712Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788785584Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788866118Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Oct 28 00:14:28.789901 containerd[1690]: time="2025-10-28T00:14:28.788878331Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788887095Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788895859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788902269Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788908625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788915904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788925084Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Oct 
28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788933402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788939314Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.788944978Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.789034270Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.789049535Z" level=info msg="Start snapshots syncer" Oct 28 00:14:28.790114 containerd[1690]: time="2025-10-28T00:14:28.789320889Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Oct 28 00:14:28.790265 containerd[1690]: time="2025-10-28T00:14:28.789962703Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Oct 28 00:14:28.790265 containerd[1690]: time="2025-10-28T00:14:28.790003473Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790082985Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790195164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790214615Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790224041Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790230752Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790237593Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790243530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790256405Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790274935Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790284027Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790290457Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790313672Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790324084Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 28 00:14:28.790356 containerd[1690]: time="2025-10-28T00:14:28.790329182Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790334335Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790338774Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790345704Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790352723Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790361991Z" level=info msg="runtime interface created" Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790365354Z" level=info msg="created NRI interface" Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790370049Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790376626Z" level=info msg="Connect containerd service" Oct 28 00:14:28.790554 containerd[1690]: time="2025-10-28T00:14:28.790395375Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Oct 
28 00:14:28.792001 containerd[1690]: time="2025-10-28T00:14:28.791789400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 28 00:14:28.811894 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Oct 28 00:14:28.921311 containerd[1690]: time="2025-10-28T00:14:28.921199611Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Oct 28 00:14:28.921311 containerd[1690]: time="2025-10-28T00:14:28.921243224Z" level=info msg=serving... address=/run/containerd/containerd.sock Oct 28 00:14:28.921311 containerd[1690]: time="2025-10-28T00:14:28.921259395Z" level=info msg="Start subscribing containerd event" Oct 28 00:14:28.921311 containerd[1690]: time="2025-10-28T00:14:28.921274013Z" level=info msg="Start recovering state" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921326961Z" level=info msg="Start event monitor" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921334964Z" level=info msg="Start cni network conf syncer for default" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921342153Z" level=info msg="Start streaming server" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921348114Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921352000Z" level=info msg="runtime interface starting up..." Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921354966Z" level=info msg="starting plugins..." Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921362111Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Oct 28 00:14:28.921432 containerd[1690]: time="2025-10-28T00:14:28.921424157Z" level=info msg="containerd successfully booted in 0.216161s" Oct 28 00:14:28.921591 systemd[1]: Started containerd.service - containerd container runtime. Oct 28 00:14:28.927903 tar[1664]: linux-amd64/README.md Oct 28 00:14:28.938106 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Oct 28 00:14:29.750861 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:14:29.751215 systemd[1]: Reached target multi-user.target - Multi-User System. Oct 28 00:14:29.751731 systemd[1]: Startup finished in 2.538s (kernel) + 6.265s (initrd) + 5.069s (userspace) = 13.874s. Oct 28 00:14:29.756753 (kubelet)[1845]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:14:30.123442 login[1813]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 28 00:14:30.124475 login[1814]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 28 00:14:30.130166 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 28 00:14:30.131584 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 28 00:14:30.136408 systemd-logind[1652]: New session 1 of user core. Oct 28 00:14:30.138364 systemd-logind[1652]: New session 2 of user core. Oct 28 00:14:30.151185 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Oct 28 00:14:30.154546 systemd[1]: Starting user@500.service - User Manager for UID 500... 
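The "no network config found in /etc/cni/net.d" error above is expected until a CNI configuration exists; on a kubeadm-style node that file is normally dropped in later by whichever CNI add-on is installed. The bridge conflist below is only an illustrative placeholder following the standard CNI conflist schema (network name, subnet and file name are invented).

sudo mkdir -p /etc/cni/net.d
sudo tee /etc/cni/net.d/10-demo-bridge.conflist <<'EOF'
{
  "cniVersion": "1.0.0",
  "name": "demo-bridge",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[{ "subnet": "10.88.0.0/16" }]],
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}
EOF
sudo systemctl restart containerd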
Oct 28 00:14:30.162970 (systemd)[1857]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 28 00:14:30.165507 systemd-logind[1652]: New session c1 of user core. Oct 28 00:14:30.252644 systemd[1857]: Queued start job for default target default.target. Oct 28 00:14:30.258733 systemd[1857]: Created slice app.slice - User Application Slice. Oct 28 00:14:30.258756 systemd[1857]: Reached target paths.target - Paths. Oct 28 00:14:30.258850 systemd[1857]: Reached target timers.target - Timers. Oct 28 00:14:30.260976 systemd[1857]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 28 00:14:30.261734 kubelet[1845]: E1028 00:14:30.261710 1845 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:14:30.263137 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:14:30.263230 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:14:30.263439 systemd[1]: kubelet.service: Consumed 592ms CPU time, 257.5M memory peak. Oct 28 00:14:30.267814 systemd[1857]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 28 00:14:30.267851 systemd[1857]: Reached target sockets.target - Sockets. Oct 28 00:14:30.267875 systemd[1857]: Reached target basic.target - Basic System. Oct 28 00:14:30.267896 systemd[1857]: Reached target default.target - Main User Target. Oct 28 00:14:30.267912 systemd[1857]: Startup finished in 97ms. Oct 28 00:14:30.268010 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 28 00:14:30.275657 systemd[1]: Started session-1.scope - Session 1 of User core. Oct 28 00:14:30.276267 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 28 00:14:40.513589 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Oct 28 00:14:40.514861 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:14:41.134934 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:14:41.139664 (kubelet)[1895]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:14:41.177668 kubelet[1895]: E1028 00:14:41.177635 1895 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:14:41.179977 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:14:41.180124 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:14:41.180517 systemd[1]: kubelet.service: Consumed 105ms CPU time, 110.5M memory peak. Oct 28 00:14:51.430605 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Oct 28 00:14:51.432174 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:14:51.701246 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
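The kubelet failures above and below all have the same cause: /var/lib/kubelet/config.yaml does not exist yet, which is normal on a node that has not run `kubeadm init` or `kubeadm join`; the unit simply keeps restarting until that happens. Purely as an illustration of the expected file format (not what kubeadm would actually generate), a minimal KubeletConfiguration looks like this, with cgroupDriver matching the SystemdCgroup=true seen in the containerd config earlier.

sudo mkdir -p /var/lib/kubelet
sudo tee /var/lib/kubelet/config.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
EOF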
Oct 28 00:14:51.704769 (kubelet)[1910]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:14:51.727212 kubelet[1910]: E1028 00:14:51.727169 1910 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:14:51.728345 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:14:51.728430 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:14:51.728658 systemd[1]: kubelet.service: Consumed 101ms CPU time, 110.2M memory peak. Oct 28 00:14:58.511125 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 28 00:14:58.512194 systemd[1]: Started sshd@0-139.178.70.103:22-139.178.68.195:58732.service - OpenSSH per-connection server daemon (139.178.68.195:58732). Oct 28 00:14:58.563807 sshd[1917]: Accepted publickey for core from 139.178.68.195 port 58732 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:58.564562 sshd-session[1917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:58.567391 systemd-logind[1652]: New session 3 of user core. Oct 28 00:14:58.577591 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 28 00:14:58.631646 systemd[1]: Started sshd@1-139.178.70.103:22-139.178.68.195:58736.service - OpenSSH per-connection server daemon (139.178.68.195:58736). Oct 28 00:14:58.669695 sshd[1923]: Accepted publickey for core from 139.178.68.195 port 58736 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:58.670565 sshd-session[1923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:58.674368 systemd-logind[1652]: New session 4 of user core. Oct 28 00:14:58.682591 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 28 00:14:58.730082 sshd[1926]: Connection closed by 139.178.68.195 port 58736 Oct 28 00:14:58.730373 sshd-session[1923]: pam_unix(sshd:session): session closed for user core Oct 28 00:14:58.735709 systemd[1]: sshd@1-139.178.70.103:22-139.178.68.195:58736.service: Deactivated successfully. Oct 28 00:14:58.736711 systemd[1]: session-4.scope: Deactivated successfully. Oct 28 00:14:58.737253 systemd-logind[1652]: Session 4 logged out. Waiting for processes to exit. Oct 28 00:14:58.738740 systemd[1]: Started sshd@2-139.178.70.103:22-139.178.68.195:58748.service - OpenSSH per-connection server daemon (139.178.68.195:58748). Oct 28 00:14:58.741053 systemd-logind[1652]: Removed session 4. Oct 28 00:14:58.771189 sshd[1932]: Accepted publickey for core from 139.178.68.195 port 58748 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:58.771756 sshd-session[1932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:58.774482 systemd-logind[1652]: New session 5 of user core. Oct 28 00:14:58.780596 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 28 00:14:58.825945 sshd[1935]: Connection closed by 139.178.68.195 port 58748 Oct 28 00:14:58.826329 sshd-session[1932]: pam_unix(sshd:session): session closed for user core Oct 28 00:14:58.833100 systemd[1]: sshd@2-139.178.70.103:22-139.178.68.195:58748.service: Deactivated successfully. 
Oct 28 00:14:58.834240 systemd[1]: session-5.scope: Deactivated successfully. Oct 28 00:14:58.835591 systemd-logind[1652]: Session 5 logged out. Waiting for processes to exit. Oct 28 00:14:58.836288 systemd[1]: Started sshd@3-139.178.70.103:22-139.178.68.195:58756.service - OpenSSH per-connection server daemon (139.178.68.195:58756). Oct 28 00:14:58.837122 systemd-logind[1652]: Removed session 5. Oct 28 00:14:58.871045 sshd[1941]: Accepted publickey for core from 139.178.68.195 port 58756 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:58.871833 sshd-session[1941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:58.874557 systemd-logind[1652]: New session 6 of user core. Oct 28 00:14:58.883641 systemd[1]: Started session-6.scope - Session 6 of User core. Oct 28 00:14:58.932610 sshd[1944]: Connection closed by 139.178.68.195 port 58756 Oct 28 00:14:58.933003 sshd-session[1941]: pam_unix(sshd:session): session closed for user core Oct 28 00:14:58.938891 systemd[1]: sshd@3-139.178.70.103:22-139.178.68.195:58756.service: Deactivated successfully. Oct 28 00:14:58.939894 systemd[1]: session-6.scope: Deactivated successfully. Oct 28 00:14:58.940420 systemd-logind[1652]: Session 6 logged out. Waiting for processes to exit. Oct 28 00:14:58.941932 systemd[1]: Started sshd@4-139.178.70.103:22-139.178.68.195:58762.service - OpenSSH per-connection server daemon (139.178.68.195:58762). Oct 28 00:14:58.942667 systemd-logind[1652]: Removed session 6. Oct 28 00:14:58.987558 sshd[1950]: Accepted publickey for core from 139.178.68.195 port 58762 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:58.988455 sshd-session[1950]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:58.991429 systemd-logind[1652]: New session 7 of user core. Oct 28 00:14:59.009677 systemd[1]: Started session-7.scope - Session 7 of User core. Oct 28 00:14:59.122350 sudo[1954]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 28 00:14:59.122527 sudo[1954]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:14:59.135755 sudo[1954]: pam_unix(sudo:session): session closed for user root Oct 28 00:14:59.136741 sshd[1953]: Connection closed by 139.178.68.195 port 58762 Oct 28 00:14:59.137582 sshd-session[1950]: pam_unix(sshd:session): session closed for user core Oct 28 00:14:59.145902 systemd[1]: sshd@4-139.178.70.103:22-139.178.68.195:58762.service: Deactivated successfully. Oct 28 00:14:59.147047 systemd[1]: session-7.scope: Deactivated successfully. Oct 28 00:14:59.147614 systemd-logind[1652]: Session 7 logged out. Waiting for processes to exit. Oct 28 00:14:59.149331 systemd[1]: Started sshd@5-139.178.70.103:22-139.178.68.195:58764.service - OpenSSH per-connection server daemon (139.178.68.195:58764). Oct 28 00:14:59.149947 systemd-logind[1652]: Removed session 7. Oct 28 00:14:59.183166 sshd[1960]: Accepted publickey for core from 139.178.68.195 port 58764 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:59.183886 sshd-session[1960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:59.186411 systemd-logind[1652]: New session 8 of user core. Oct 28 00:14:59.197642 systemd[1]: Started session-8.scope - Session 8 of User core. 
Oct 28 00:14:59.245596 sudo[1965]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 28 00:14:59.245748 sudo[1965]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:14:59.251624 sudo[1965]: pam_unix(sudo:session): session closed for user root Oct 28 00:14:59.255621 sudo[1964]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 28 00:14:59.255774 sudo[1964]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:14:59.261820 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 28 00:14:59.286343 augenrules[1987]: No rules Oct 28 00:14:59.286649 systemd[1]: audit-rules.service: Deactivated successfully. Oct 28 00:14:59.286797 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 28 00:14:59.287642 sudo[1964]: pam_unix(sudo:session): session closed for user root Oct 28 00:14:59.288520 sshd[1963]: Connection closed by 139.178.68.195 port 58764 Oct 28 00:14:59.289160 sshd-session[1960]: pam_unix(sshd:session): session closed for user core Oct 28 00:14:59.293708 systemd[1]: Started sshd@6-139.178.70.103:22-139.178.68.195:58766.service - OpenSSH per-connection server daemon (139.178.68.195:58766). Oct 28 00:14:59.293970 systemd[1]: sshd@5-139.178.70.103:22-139.178.68.195:58764.service: Deactivated successfully. Oct 28 00:14:59.295118 systemd[1]: session-8.scope: Deactivated successfully. Oct 28 00:14:59.295633 systemd-logind[1652]: Session 8 logged out. Waiting for processes to exit. Oct 28 00:14:59.297600 systemd-logind[1652]: Removed session 8. Oct 28 00:14:59.336628 sshd[1993]: Accepted publickey for core from 139.178.68.195 port 58766 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:14:59.337304 sshd-session[1993]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:14:59.339808 systemd-logind[1652]: New session 9 of user core. Oct 28 00:14:59.347700 systemd[1]: Started session-9.scope - Session 9 of User core. Oct 28 00:14:59.396621 sudo[2000]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 28 00:14:59.396821 sudo[2000]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 28 00:15:00.078140 systemd[1]: Starting docker.service - Docker Application Container Engine... Oct 28 00:15:00.090717 (dockerd)[2017]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 28 00:15:00.317302 dockerd[2017]: time="2025-10-28T00:15:00.317272521Z" level=info msg="Starting up" Oct 28 00:15:00.319804 dockerd[2017]: time="2025-10-28T00:15:00.319605456Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Oct 28 00:15:00.325177 dockerd[2017]: time="2025-10-28T00:15:00.325161586Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Oct 28 00:15:00.333199 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport842125918-merged.mount: Deactivated successfully. Oct 28 00:15:00.354195 dockerd[2017]: time="2025-10-28T00:15:00.354173918Z" level=info msg="Loading containers: start." 
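augenrules reports "No rules" above because the two default files under /etc/audit/rules.d/ were just removed via sudo. A replacement rule set can be dropped in and reloaded as sketched below; the watch on /etc/passwd is an arbitrary example, not something taken from this system.

sudo tee /etc/audit/rules.d/10-local.rules <<'EOF'
-D
-w /etc/passwd -p wa -k passwd-changes
EOF
sudo augenrules --load
sudo auditctl -l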
Oct 28 00:15:00.367512 kernel: Initializing XFRM netlink socket Oct 28 00:15:00.530989 systemd-networkd[1481]: docker0: Link UP Oct 28 00:15:00.554265 dockerd[2017]: time="2025-10-28T00:15:00.554209664Z" level=info msg="Loading containers: done." Oct 28 00:15:00.563441 dockerd[2017]: time="2025-10-28T00:15:00.563235352Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 28 00:15:00.563441 dockerd[2017]: time="2025-10-28T00:15:00.563292348Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Oct 28 00:15:00.563441 dockerd[2017]: time="2025-10-28T00:15:00.563336668Z" level=info msg="Initializing buildkit" Oct 28 00:15:00.573007 dockerd[2017]: time="2025-10-28T00:15:00.572994833Z" level=info msg="Completed buildkit initialization" Oct 28 00:15:00.577719 dockerd[2017]: time="2025-10-28T00:15:00.577703383Z" level=info msg="Daemon has completed initialization" Oct 28 00:15:00.577826 dockerd[2017]: time="2025-10-28T00:15:00.577798500Z" level=info msg="API listen on /run/docker.sock" Oct 28 00:15:00.578447 systemd[1]: Started docker.service - Docker Application Container Engine. Oct 28 00:15:01.112632 containerd[1690]: time="2025-10-28T00:15:01.112599693Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.1\"" Oct 28 00:15:01.666773 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3587490658.mount: Deactivated successfully. Oct 28 00:15:01.790255 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Oct 28 00:15:01.791505 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:02.062877 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:02.068749 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:15:02.143971 kubelet[2288]: E1028 00:15:02.143938 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:15:02.145355 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:15:02.145504 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:15:02.145867 systemd[1]: kubelet.service: Consumed 101ms CPU time, 110.1M memory peak. 
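Once dockerd reports "API listen on /run/docker.sock" above, the daemon state it logged (overlay2 storage driver, version 28.0.4, the docker0 bridge) can be confirmed with read-only commands:

docker info --format '{{.ServerVersion}} {{.Driver}}'
docker version
ip addr show docker0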
Oct 28 00:15:02.735633 containerd[1690]: time="2025-10-28T00:15:02.735086924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:02.735633 containerd[1690]: time="2025-10-28T00:15:02.735513001Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.1: active requests=0, bytes read=27065392" Oct 28 00:15:02.735633 containerd[1690]: time="2025-10-28T00:15:02.735607590Z" level=info msg="ImageCreate event name:\"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:02.736977 containerd[1690]: time="2025-10-28T00:15:02.736964882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b9d7c117f8ac52bed4b13aeed973dc5198f9d93a926e6fe9e0b384f155baa902\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:02.737509 containerd[1690]: time="2025-10-28T00:15:02.737484373Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.1\" with image id \"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b9d7c117f8ac52bed4b13aeed973dc5198f9d93a926e6fe9e0b384f155baa902\", size \"27061991\" in 1.624858241s" Oct 28 00:15:02.737539 containerd[1690]: time="2025-10-28T00:15:02.737514376Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.1\" returns image reference \"sha256:c3994bc6961024917ec0aeee02e62828108c21a52d87648e30f3080d9cbadc97\"" Oct 28 00:15:02.738172 containerd[1690]: time="2025-10-28T00:15:02.738126442Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.1\"" Oct 28 00:15:04.011510 containerd[1690]: time="2025-10-28T00:15:04.011172357Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:04.018124 containerd[1690]: time="2025-10-28T00:15:04.018107942Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.1: active requests=0, bytes read=21159757" Oct 28 00:15:04.023279 containerd[1690]: time="2025-10-28T00:15:04.023250497Z" level=info msg="ImageCreate event name:\"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:04.031508 containerd[1690]: time="2025-10-28T00:15:04.031109801Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:2bf47c1b01f51e8963bf2327390883c9fa4ed03ea1b284500a2cba17ce303e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:04.031556 containerd[1690]: time="2025-10-28T00:15:04.031535037Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.1\" with image id \"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:2bf47c1b01f51e8963bf2327390883c9fa4ed03ea1b284500a2cba17ce303e89\", size \"22820214\" in 1.293317455s" Oct 28 00:15:04.031593 containerd[1690]: time="2025-10-28T00:15:04.031560083Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.1\" returns image reference \"sha256:c80c8dbafe7dd71fc21527912a6dd20ccd1b71f3e561a5c28337388d0619538f\"" Oct 28 00:15:04.031889 
containerd[1690]: time="2025-10-28T00:15:04.031828075Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.1\"" Oct 28 00:15:05.159813 containerd[1690]: time="2025-10-28T00:15:05.159775022Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:05.160271 containerd[1690]: time="2025-10-28T00:15:05.160240589Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.1: active requests=0, bytes read=15725093" Oct 28 00:15:05.161430 containerd[1690]: time="2025-10-28T00:15:05.160706684Z" level=info msg="ImageCreate event name:\"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:05.162551 containerd[1690]: time="2025-10-28T00:15:05.162534775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:6e9fbc4e25a576483e6a233976353a66e4d77eb5d0530e9118e94b7d46fb3500\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:05.163675 containerd[1690]: time="2025-10-28T00:15:05.163658829Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.1\" with image id \"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:6e9fbc4e25a576483e6a233976353a66e4d77eb5d0530e9118e94b7d46fb3500\", size \"17385568\" in 1.131717688s" Oct 28 00:15:05.163735 containerd[1690]: time="2025-10-28T00:15:05.163726964Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.1\" returns image reference \"sha256:7dd6aaa1717ab7eaae4578503e4c4d9965fcf5a249e8155fe16379ee9b6cb813\"" Oct 28 00:15:05.164031 containerd[1690]: time="2025-10-28T00:15:05.164009191Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.1\"" Oct 28 00:15:06.342748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2001633584.mount: Deactivated successfully. 
Oct 28 00:15:06.566609 containerd[1690]: time="2025-10-28T00:15:06.566574518Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:06.573813 containerd[1690]: time="2025-10-28T00:15:06.573794702Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.1: active requests=0, bytes read=25964699" Oct 28 00:15:06.584461 containerd[1690]: time="2025-10-28T00:15:06.584434447Z" level=info msg="ImageCreate event name:\"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:06.591052 containerd[1690]: time="2025-10-28T00:15:06.591034113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:913cc83ca0b5588a81d86ce8eedeb3ed1e9c1326e81852a1ea4f622b74ff749a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:06.591618 containerd[1690]: time="2025-10-28T00:15:06.591600942Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.1\" with image id \"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\", repo tag \"registry.k8s.io/kube-proxy:v1.34.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:913cc83ca0b5588a81d86ce8eedeb3ed1e9c1326e81852a1ea4f622b74ff749a\", size \"25963718\" in 1.427576218s" Oct 28 00:15:06.591656 containerd[1690]: time="2025-10-28T00:15:06.591621156Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.1\" returns image reference \"sha256:fc25172553d79197ecd840ec8dba1fba68330079355e974b04c1a441e6a4a0b7\"" Oct 28 00:15:06.591887 containerd[1690]: time="2025-10-28T00:15:06.591872967Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Oct 28 00:15:07.226463 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3128257300.mount: Deactivated successfully. 
Oct 28 00:15:08.935949 containerd[1690]: time="2025-10-28T00:15:08.935910850Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:08.940152 containerd[1690]: time="2025-10-28T00:15:08.940124220Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388007" Oct 28 00:15:08.947927 containerd[1690]: time="2025-10-28T00:15:08.947886340Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:08.952504 containerd[1690]: time="2025-10-28T00:15:08.951607577Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:08.952504 containerd[1690]: time="2025-10-28T00:15:08.952335817Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 2.360443513s" Oct 28 00:15:08.952504 containerd[1690]: time="2025-10-28T00:15:08.952354502Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Oct 28 00:15:08.952952 containerd[1690]: time="2025-10-28T00:15:08.952938985Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Oct 28 00:15:09.723036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1135899968.mount: Deactivated successfully. 
Oct 28 00:15:09.731914 containerd[1690]: time="2025-10-28T00:15:09.731868799Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:09.732267 containerd[1690]: time="2025-10-28T00:15:09.732254079Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321218" Oct 28 00:15:09.733207 containerd[1690]: time="2025-10-28T00:15:09.732486001Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:09.733535 containerd[1690]: time="2025-10-28T00:15:09.733521283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:09.734015 containerd[1690]: time="2025-10-28T00:15:09.733994739Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 780.986809ms" Oct 28 00:15:09.734068 containerd[1690]: time="2025-10-28T00:15:09.734059633Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Oct 28 00:15:09.734461 containerd[1690]: time="2025-10-28T00:15:09.734446753Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Oct 28 00:15:12.290488 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Oct 28 00:15:12.292386 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:12.795589 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Oct 28 00:15:12.805728 (kubelet)[2426]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 28 00:15:12.821767 containerd[1690]: time="2025-10-28T00:15:12.821721846Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:12.836558 containerd[1690]: time="2025-10-28T00:15:12.836538551Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=73514593" Oct 28 00:15:12.844220 containerd[1690]: time="2025-10-28T00:15:12.844191372Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:12.849506 containerd[1690]: time="2025-10-28T00:15:12.849072967Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:12.849563 containerd[1690]: time="2025-10-28T00:15:12.849540816Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 3.115076311s" Oct 28 00:15:12.849603 containerd[1690]: time="2025-10-28T00:15:12.849569352Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Oct 28 00:15:12.902883 kubelet[2426]: E1028 00:15:12.902851 2426 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 28 00:15:12.904141 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 28 00:15:12.904237 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 28 00:15:12.904590 systemd[1]: kubelet.service: Consumed 116ms CPU time, 111.8M memory peak. Oct 28 00:15:13.890565 update_engine[1653]: I20251028 00:15:13.890514 1653 update_attempter.cc:509] Updating boot flags... Oct 28 00:15:15.441486 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:15.441814 systemd[1]: kubelet.service: Consumed 116ms CPU time, 111.8M memory peak. Oct 28 00:15:15.443535 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:15.466562 systemd[1]: Reload requested from client PID 2477 ('systemctl') (unit session-9.scope)... Oct 28 00:15:15.466644 systemd[1]: Reloading... Oct 28 00:15:15.524508 zram_generator::config[2525]: No configuration found. Oct 28 00:15:15.599841 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:15:15.672638 systemd[1]: Reloading finished in 205 ms. Oct 28 00:15:15.757843 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Oct 28 00:15:15.757919 systemd[1]: kubelet.service: Failed with result 'signal'. 
Oct 28 00:15:15.758349 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:15.760246 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:16.069596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:16.074847 (kubelet)[2589]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 28 00:15:16.134290 kubelet[2589]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 28 00:15:16.134290 kubelet[2589]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 28 00:15:16.161944 kubelet[2589]: I1028 00:15:16.161897 2589 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 28 00:15:16.564811 kubelet[2589]: I1028 00:15:16.564788 2589 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Oct 28 00:15:16.564811 kubelet[2589]: I1028 00:15:16.564806 2589 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 28 00:15:16.566122 kubelet[2589]: I1028 00:15:16.566111 2589 watchdog_linux.go:95] "Systemd watchdog is not enabled" Oct 28 00:15:16.566122 kubelet[2589]: I1028 00:15:16.566121 2589 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 28 00:15:16.566269 kubelet[2589]: I1028 00:15:16.566258 2589 server.go:956] "Client rotation is on, will bootstrap in background" Oct 28 00:15:16.580050 kubelet[2589]: I1028 00:15:16.579970 2589 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 28 00:15:16.585972 kubelet[2589]: E1028 00:15:16.585133 2589 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.103:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Oct 28 00:15:16.591221 kubelet[2589]: I1028 00:15:16.591206 2589 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 28 00:15:16.596562 kubelet[2589]: I1028 00:15:16.596544 2589 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Oct 28 00:15:16.597277 kubelet[2589]: I1028 00:15:16.597257 2589 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 28 00:15:16.598434 kubelet[2589]: I1028 00:15:16.597316 2589 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 28 00:15:16.598561 kubelet[2589]: I1028 00:15:16.598555 2589 topology_manager.go:138] "Creating topology manager with none policy" Oct 28 00:15:16.598628 kubelet[2589]: I1028 00:15:16.598622 2589 container_manager_linux.go:306] "Creating device plugin manager" Oct 28 00:15:16.598721 kubelet[2589]: I1028 00:15:16.598715 2589 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Oct 28 00:15:16.600431 kubelet[2589]: I1028 00:15:16.600423 2589 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:15:16.600633 kubelet[2589]: I1028 00:15:16.600627 2589 kubelet.go:475] "Attempting to sync node with API server" Oct 28 00:15:16.600669 kubelet[2589]: I1028 00:15:16.600665 2589 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 28 00:15:16.601038 kubelet[2589]: E1028 00:15:16.601000 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 28 00:15:16.601078 kubelet[2589]: I1028 00:15:16.601072 2589 kubelet.go:387] "Adding apiserver pod source" Oct 28 00:15:16.601128 kubelet[2589]: I1028 00:15:16.601121 2589 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 28 00:15:16.603688 kubelet[2589]: E1028 00:15:16.603674 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 28 00:15:16.603943 kubelet[2589]: I1028 00:15:16.603934 2589 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 28 00:15:16.606757 kubelet[2589]: I1028 00:15:16.606726 2589 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 28 00:15:16.606832 kubelet[2589]: I1028 00:15:16.606826 2589 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Oct 28 00:15:16.609413 kubelet[2589]: W1028 00:15:16.609403 2589 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Oct 28 00:15:16.614540 kubelet[2589]: I1028 00:15:16.614524 2589 server.go:1262] "Started kubelet" Oct 28 00:15:16.616687 kubelet[2589]: I1028 00:15:16.616672 2589 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 28 00:15:16.620907 kubelet[2589]: I1028 00:15:16.620825 2589 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 28 00:15:16.624606 kubelet[2589]: I1028 00:15:16.624578 2589 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 28 00:15:16.626859 kubelet[2589]: I1028 00:15:16.626599 2589 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 28 00:15:16.626859 kubelet[2589]: E1028 00:15:16.626713 2589 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:15:16.632453 kubelet[2589]: I1028 00:15:16.632430 2589 volume_manager.go:313] "Starting Kubelet Volume Manager" Oct 28 00:15:16.633235 kubelet[2589]: I1028 00:15:16.633220 2589 reconciler.go:29] "Reconciler: start to sync state" Oct 28 00:15:16.633376 kubelet[2589]: I1028 00:15:16.633363 2589 factory.go:223] Registration of the systemd container factory successfully Oct 28 00:15:16.633420 kubelet[2589]: I1028 00:15:16.633407 2589 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 28 00:15:16.635248 kubelet[2589]: E1028 00:15:16.634130 2589 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.103:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.103:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18727f6e736601a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-10-28 00:15:16.614480288 +0000 UTC m=+0.536364885,LastTimestamp:2025-10-28 00:15:16.614480288 +0000 UTC m=+0.536364885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Oct 28 00:15:16.636504 kubelet[2589]: E1028 00:15:16.635887 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 28 00:15:16.636504 kubelet[2589]: E1028 00:15:16.635955 2589 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="200ms" Oct 28 00:15:16.636504 kubelet[2589]: I1028 00:15:16.636150 2589 server.go:310] "Adding debug handlers to kubelet server" Oct 28 00:15:16.638818 kubelet[2589]: I1028 00:15:16.638788 2589 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 28 00:15:16.638943 kubelet[2589]: I1028 00:15:16.638935 2589 server_v1.go:49] "podresources" method="list" useActivePods=true Oct 28 00:15:16.639106 kubelet[2589]: I1028 00:15:16.639098 2589 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 28 00:15:16.643004 kubelet[2589]: I1028 00:15:16.642988 2589 factory.go:223] Registration of the containerd container factory successfully Oct 28 00:15:16.648328 kubelet[2589]: I1028 00:15:16.648306 2589 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Oct 28 00:15:16.649730 kubelet[2589]: I1028 00:15:16.649718 2589 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Oct 28 00:15:16.649796 kubelet[2589]: I1028 00:15:16.649791 2589 status_manager.go:244] "Starting to sync pod status with apiserver" Oct 28 00:15:16.649844 kubelet[2589]: I1028 00:15:16.649840 2589 kubelet.go:2427] "Starting kubelet main sync loop" Oct 28 00:15:16.649904 kubelet[2589]: E1028 00:15:16.649894 2589 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 28 00:15:16.654255 kubelet[2589]: E1028 00:15:16.654238 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 28 00:15:16.666079 kubelet[2589]: I1028 00:15:16.666060 2589 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 28 00:15:16.666644 kubelet[2589]: I1028 00:15:16.666469 2589 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 28 00:15:16.666644 kubelet[2589]: I1028 00:15:16.666481 2589 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:15:16.667168 kubelet[2589]: I1028 00:15:16.667161 2589 policy_none.go:49] "None policy: Start" Oct 28 00:15:16.667215 kubelet[2589]: I1028 00:15:16.667210 2589 memory_manager.go:187] "Starting memorymanager" policy="None" Oct 28 00:15:16.667253 kubelet[2589]: I1028 00:15:16.667248 2589 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Oct 28 00:15:16.667698 kubelet[2589]: I1028 00:15:16.667691 2589 policy_none.go:47] "Start" Oct 28 00:15:16.670310 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Oct 28 00:15:16.679349 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Oct 28 00:15:16.682023 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Oct 28 00:15:16.690322 kubelet[2589]: E1028 00:15:16.690305 2589 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 28 00:15:16.690724 kubelet[2589]: I1028 00:15:16.690716 2589 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 28 00:15:16.690797 kubelet[2589]: I1028 00:15:16.690777 2589 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 28 00:15:16.690968 kubelet[2589]: I1028 00:15:16.690962 2589 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 28 00:15:16.691586 kubelet[2589]: E1028 00:15:16.691576 2589 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Oct 28 00:15:16.691684 kubelet[2589]: E1028 00:15:16.691676 2589 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Oct 28 00:15:16.774725 systemd[1]: Created slice kubepods-burstable-pod72ae43bf624d285361487631af8a6ba6.slice - libcontainer container kubepods-burstable-pod72ae43bf624d285361487631af8a6ba6.slice. Oct 28 00:15:16.787433 kubelet[2589]: E1028 00:15:16.787412 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:16.790148 systemd[1]: Created slice kubepods-burstable-podce161b3b11c90b0b844f2e4f86b4e8cd.slice - libcontainer container kubepods-burstable-podce161b3b11c90b0b844f2e4f86b4e8cd.slice. Oct 28 00:15:16.791813 kubelet[2589]: I1028 00:15:16.791639 2589 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:16.791882 kubelet[2589]: E1028 00:15:16.791846 2589 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Oct 28 00:15:16.800523 kubelet[2589]: E1028 00:15:16.800510 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:16.802829 systemd[1]: Created slice kubepods-burstable-pod0aec103f82290345bcc057479e5c6dde.slice - libcontainer container kubepods-burstable-pod0aec103f82290345bcc057479e5c6dde.slice. 
Oct 28 00:15:16.804071 kubelet[2589]: E1028 00:15:16.804059 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:16.834106 kubelet[2589]: I1028 00:15:16.833388 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:16.834106 kubelet[2589]: I1028 00:15:16.833411 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:16.834106 kubelet[2589]: I1028 00:15:16.833425 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72ae43bf624d285361487631af8a6ba6-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72ae43bf624d285361487631af8a6ba6\") " pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:16.834106 kubelet[2589]: I1028 00:15:16.833434 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:16.834106 kubelet[2589]: I1028 00:15:16.833444 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:16.834231 kubelet[2589]: I1028 00:15:16.833451 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:16.834231 kubelet[2589]: I1028 00:15:16.833460 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:16.834231 kubelet[2589]: I1028 00:15:16.833467 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:16.834231 kubelet[2589]: I1028 00:15:16.833477 2589 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:16.836707 kubelet[2589]: E1028 00:15:16.836689 2589 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="400ms" Oct 28 00:15:16.993628 kubelet[2589]: I1028 00:15:16.993609 2589 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:16.993827 kubelet[2589]: E1028 00:15:16.993810 2589 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Oct 28 00:15:17.089889 containerd[1690]: time="2025-10-28T00:15:17.089823324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72ae43bf624d285361487631af8a6ba6,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:17.101933 containerd[1690]: time="2025-10-28T00:15:17.101904012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:ce161b3b11c90b0b844f2e4f86b4e8cd,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:17.106121 containerd[1690]: time="2025-10-28T00:15:17.106098193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0aec103f82290345bcc057479e5c6dde,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:17.237848 kubelet[2589]: E1028 00:15:17.237820 2589 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="800ms" Oct 28 00:15:17.394880 kubelet[2589]: I1028 00:15:17.394813 2589 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:17.395154 kubelet[2589]: E1028 00:15:17.395135 2589 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Oct 28 00:15:17.773021 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount762848598.mount: Deactivated successfully. 
Oct 28 00:15:17.775727 containerd[1690]: time="2025-10-28T00:15:17.775705659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 28 00:15:17.776477 containerd[1690]: time="2025-10-28T00:15:17.776417089Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Oct 28 00:15:17.777118 containerd[1690]: time="2025-10-28T00:15:17.777101668Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 28 00:15:17.777825 containerd[1690]: time="2025-10-28T00:15:17.777808094Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Oct 28 00:15:17.778184 containerd[1690]: time="2025-10-28T00:15:17.778167443Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 28 00:15:17.779251 containerd[1690]: time="2025-10-28T00:15:17.779234480Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Oct 28 00:15:17.779543 containerd[1690]: time="2025-10-28T00:15:17.779528027Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 28 00:15:17.781430 containerd[1690]: time="2025-10-28T00:15:17.781412983Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 28 00:15:17.782980 containerd[1690]: time="2025-10-28T00:15:17.782562692Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 675.836344ms" Oct 28 00:15:17.782980 containerd[1690]: time="2025-10-28T00:15:17.782584870Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 680.014842ms" Oct 28 00:15:17.782980 containerd[1690]: time="2025-10-28T00:15:17.782914696Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 692.072528ms" Oct 28 00:15:17.854146 containerd[1690]: time="2025-10-28T00:15:17.854113776Z" level=info msg="connecting to shim 221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25" address="unix:///run/containerd/s/8f8d8e050dcf7bcd604c56b45e6833e8b46591d345cd6f1c4881459a1bf8dfe4" namespace=k8s.io protocol=ttrpc version=3 Oct 
28 00:15:17.855968 containerd[1690]: time="2025-10-28T00:15:17.855942599Z" level=info msg="connecting to shim 18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2" address="unix:///run/containerd/s/abf3bbd63148e1e424dec54316833be21b5e40e98ac914d0ed3abcfc350e8e58" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:17.857347 containerd[1690]: time="2025-10-28T00:15:17.857331420Z" level=info msg="connecting to shim a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9" address="unix:///run/containerd/s/0cddacf3bb365da5bebb85d0f3706928cca74440848978ec0a903b351e413692" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:17.893820 kubelet[2589]: E1028 00:15:17.893180 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.103:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 28 00:15:17.897231 kubelet[2589]: E1028 00:15:17.897046 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.103:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 28 00:15:17.899975 kubelet[2589]: E1028 00:15:17.899381 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.103:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 28 00:15:17.928634 systemd[1]: Started cri-containerd-221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25.scope - libcontainer container 221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25. Oct 28 00:15:17.932778 systemd[1]: Started cri-containerd-18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2.scope - libcontainer container 18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2. Oct 28 00:15:17.934779 systemd[1]: Started cri-containerd-a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9.scope - libcontainer container a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9. 
Oct 28 00:15:18.002040 containerd[1690]: time="2025-10-28T00:15:18.001552714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:ce161b3b11c90b0b844f2e4f86b4e8cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9\"" Oct 28 00:15:18.010813 containerd[1690]: time="2025-10-28T00:15:18.010787822Z" level=info msg="CreateContainer within sandbox \"a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Oct 28 00:15:18.021806 containerd[1690]: time="2025-10-28T00:15:18.021774105Z" level=info msg="Container 9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:18.028486 containerd[1690]: time="2025-10-28T00:15:18.028357851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0aec103f82290345bcc057479e5c6dde,Namespace:kube-system,Attempt:0,} returns sandbox id \"18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2\"" Oct 28 00:15:18.028763 containerd[1690]: time="2025-10-28T00:15:18.028744245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72ae43bf624d285361487631af8a6ba6,Namespace:kube-system,Attempt:0,} returns sandbox id \"221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25\"" Oct 28 00:15:18.034521 containerd[1690]: time="2025-10-28T00:15:18.034476386Z" level=info msg="CreateContainer within sandbox \"18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Oct 28 00:15:18.035971 containerd[1690]: time="2025-10-28T00:15:18.035952824Z" level=info msg="CreateContainer within sandbox \"a606fecd4c81d2dd049f8f1c6eca893de44b8ece520e34b8d0ef0a54a35383b9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4\"" Oct 28 00:15:18.036095 containerd[1690]: time="2025-10-28T00:15:18.036080786Z" level=info msg="CreateContainer within sandbox \"221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Oct 28 00:15:18.038838 kubelet[2589]: E1028 00:15:18.038808 2589 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.103:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.103:6443: connect: connection refused" interval="1.6s" Oct 28 00:15:18.039420 containerd[1690]: time="2025-10-28T00:15:18.039399978Z" level=info msg="Container 3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:18.043790 containerd[1690]: time="2025-10-28T00:15:18.043767550Z" level=info msg="StartContainer for \"9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4\"" Oct 28 00:15:18.045800 containerd[1690]: time="2025-10-28T00:15:18.045780830Z" level=info msg="connecting to shim 9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4" address="unix:///run/containerd/s/0cddacf3bb365da5bebb85d0f3706928cca74440848978ec0a903b351e413692" protocol=ttrpc version=3 Oct 28 00:15:18.047344 containerd[1690]: time="2025-10-28T00:15:18.047312194Z" level=info msg="Container b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8: CDI devices 
from CRI Config.CDIDevices: []" Oct 28 00:15:18.050194 containerd[1690]: time="2025-10-28T00:15:18.050172797Z" level=info msg="CreateContainer within sandbox \"18a46522431824445ad4da3874a64b310c528cb4d05171652ce91e9243d8f2d2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40\"" Oct 28 00:15:18.050712 containerd[1690]: time="2025-10-28T00:15:18.050697775Z" level=info msg="StartContainer for \"3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40\"" Oct 28 00:15:18.051274 containerd[1690]: time="2025-10-28T00:15:18.051242091Z" level=info msg="connecting to shim 3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40" address="unix:///run/containerd/s/abf3bbd63148e1e424dec54316833be21b5e40e98ac914d0ed3abcfc350e8e58" protocol=ttrpc version=3 Oct 28 00:15:18.051765 containerd[1690]: time="2025-10-28T00:15:18.051602098Z" level=info msg="CreateContainer within sandbox \"221db60ce9fa9fbddd1e5ae614dfa38067548a838829a38ddb14a75a4cfa9b25\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8\"" Oct 28 00:15:18.052232 containerd[1690]: time="2025-10-28T00:15:18.052217697Z" level=info msg="StartContainer for \"b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8\"" Oct 28 00:15:18.052894 containerd[1690]: time="2025-10-28T00:15:18.052871994Z" level=info msg="connecting to shim b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8" address="unix:///run/containerd/s/8f8d8e050dcf7bcd604c56b45e6833e8b46591d345cd6f1c4881459a1bf8dfe4" protocol=ttrpc version=3 Oct 28 00:15:18.066597 systemd[1]: Started cri-containerd-9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4.scope - libcontainer container 9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4. Oct 28 00:15:18.072613 systemd[1]: Started cri-containerd-3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40.scope - libcontainer container 3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40. Oct 28 00:15:18.074911 systemd[1]: Started cri-containerd-b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8.scope - libcontainer container b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8. 
Oct 28 00:15:18.121399 containerd[1690]: time="2025-10-28T00:15:18.121311049Z" level=info msg="StartContainer for \"9e6b76af8ce9be7ed105aa1da8f9b3cb0166ce0a609cbc2a05d7f21f919540a4\" returns successfully" Oct 28 00:15:18.129647 containerd[1690]: time="2025-10-28T00:15:18.129611571Z" level=info msg="StartContainer for \"3cc6fbc2dd26f5dc44edcbe12517567f9f9e1ec11cfbad0fab595e4607a98c40\" returns successfully" Oct 28 00:15:18.148016 containerd[1690]: time="2025-10-28T00:15:18.147993273Z" level=info msg="StartContainer for \"b68ae9b50b48f4470750d70bd2b0b2acf1acffd3c397a233b031e6ff2efb6bf8\" returns successfully" Oct 28 00:15:18.193726 kubelet[2589]: E1028 00:15:18.193701 2589 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.103:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.103:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 28 00:15:18.196736 kubelet[2589]: I1028 00:15:18.196722 2589 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:18.196892 kubelet[2589]: E1028 00:15:18.196877 2589 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.103:6443/api/v1/nodes\": dial tcp 139.178.70.103:6443: connect: connection refused" node="localhost" Oct 28 00:15:18.670577 kubelet[2589]: E1028 00:15:18.670557 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:18.675263 kubelet[2589]: E1028 00:15:18.675245 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:18.676541 kubelet[2589]: E1028 00:15:18.676528 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:19.657643 kubelet[2589]: E1028 00:15:19.657612 2589 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Oct 28 00:15:19.679859 kubelet[2589]: E1028 00:15:19.679840 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:19.680066 kubelet[2589]: E1028 00:15:19.680031 2589 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 28 00:15:19.798978 kubelet[2589]: I1028 00:15:19.798951 2589 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:19.809885 kubelet[2589]: I1028 00:15:19.809860 2589 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 28 00:15:19.809885 kubelet[2589]: E1028 00:15:19.809885 2589 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Oct 28 00:15:19.819276 kubelet[2589]: E1028 00:15:19.819253 2589 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:15:19.920401 kubelet[2589]: E1028 00:15:19.920318 2589 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 28 00:15:19.991732 
kubelet[2589]: I1028 00:15:19.991711 2589 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:19.996703 kubelet[2589]: E1028 00:15:19.996409 2589 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:20.033012 kubelet[2589]: I1028 00:15:20.032985 2589 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:20.034274 kubelet[2589]: E1028 00:15:20.034257 2589 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:20.034351 kubelet[2589]: I1028 00:15:20.034281 2589 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:20.035284 kubelet[2589]: E1028 00:15:20.035264 2589 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:20.035284 kubelet[2589]: I1028 00:15:20.035276 2589 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:20.036125 kubelet[2589]: E1028 00:15:20.036111 2589 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:20.606451 kubelet[2589]: I1028 00:15:20.606159 2589 apiserver.go:52] "Watching apiserver" Oct 28 00:15:20.626930 kubelet[2589]: I1028 00:15:20.626880 2589 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 28 00:15:20.679257 kubelet[2589]: I1028 00:15:20.679239 2589 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:21.370137 systemd[1]: Reload requested from client PID 2871 ('systemctl') (unit session-9.scope)... Oct 28 00:15:21.370304 systemd[1]: Reloading... Oct 28 00:15:21.425556 zram_generator::config[2915]: No configuration found. Oct 28 00:15:21.509359 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 28 00:15:21.587408 systemd[1]: Reloading finished in 216 ms. Oct 28 00:15:21.616881 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:21.638879 systemd[1]: kubelet.service: Deactivated successfully. Oct 28 00:15:21.639135 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:21.639213 systemd[1]: kubelet.service: Consumed 710ms CPU time, 123M memory peak. Oct 28 00:15:21.640884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 28 00:15:22.183502 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 28 00:15:22.189812 (kubelet)[2983]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 28 00:15:22.281980 kubelet[2983]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Oct 28 00:15:22.282189 kubelet[2983]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 28 00:15:22.282278 kubelet[2983]: I1028 00:15:22.282262 2983 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 28 00:15:22.289152 kubelet[2983]: I1028 00:15:22.289138 2983 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Oct 28 00:15:22.289221 kubelet[2983]: I1028 00:15:22.289215 2983 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 28 00:15:22.290554 kubelet[2983]: I1028 00:15:22.290546 2983 watchdog_linux.go:95] "Systemd watchdog is not enabled" Oct 28 00:15:22.290600 kubelet[2983]: I1028 00:15:22.290594 2983 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 28 00:15:22.290745 kubelet[2983]: I1028 00:15:22.290737 2983 server.go:956] "Client rotation is on, will bootstrap in background" Oct 28 00:15:22.292485 kubelet[2983]: I1028 00:15:22.292476 2983 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Oct 28 00:15:22.295297 kubelet[2983]: I1028 00:15:22.295279 2983 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 28 00:15:22.313871 kubelet[2983]: I1028 00:15:22.313858 2983 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 28 00:15:22.316415 kubelet[2983]: I1028 00:15:22.316397 2983 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Oct 28 00:15:22.318203 kubelet[2983]: I1028 00:15:22.318184 2983 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 28 00:15:22.318350 kubelet[2983]: I1028 00:15:22.318249 2983 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 28 00:15:22.318433 kubelet[2983]: I1028 00:15:22.318427 2983 topology_manager.go:138] "Creating topology manager with none policy" Oct 28 00:15:22.318467 kubelet[2983]: I1028 00:15:22.318463 2983 container_manager_linux.go:306] "Creating device plugin manager" Oct 28 00:15:22.318566 kubelet[2983]: I1028 00:15:22.318559 2983 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Oct 28 00:15:22.319381 kubelet[2983]: I1028 00:15:22.319373 2983 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:15:22.319559 kubelet[2983]: I1028 00:15:22.319552 2983 kubelet.go:475] "Attempting to sync node with API server" Oct 28 00:15:22.319600 kubelet[2983]: I1028 00:15:22.319595 2983 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 28 00:15:22.320380 kubelet[2983]: I1028 00:15:22.320372 2983 kubelet.go:387] "Adding apiserver pod source" Oct 28 00:15:22.320428 kubelet[2983]: I1028 00:15:22.320423 2983 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 28 00:15:22.327528 kubelet[2983]: I1028 00:15:22.326709 2983 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 28 00:15:22.327801 kubelet[2983]: I1028 00:15:22.327790 2983 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 28 00:15:22.327844 kubelet[2983]: I1028 00:15:22.327811 2983 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Oct 28 00:15:22.332190 
kubelet[2983]: I1028 00:15:22.331112 2983 server.go:1262] "Started kubelet" Oct 28 00:15:22.344802 kubelet[2983]: I1028 00:15:22.344782 2983 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 28 00:15:22.351933 kubelet[2983]: I1028 00:15:22.351910 2983 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 28 00:15:22.352740 kubelet[2983]: I1028 00:15:22.352730 2983 server.go:310] "Adding debug handlers to kubelet server" Oct 28 00:15:22.356948 kubelet[2983]: I1028 00:15:22.356925 2983 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 28 00:15:22.357043 kubelet[2983]: I1028 00:15:22.357036 2983 server_v1.go:49] "podresources" method="list" useActivePods=true Oct 28 00:15:22.357170 kubelet[2983]: I1028 00:15:22.357158 2983 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 28 00:15:22.357390 kubelet[2983]: I1028 00:15:22.357382 2983 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 28 00:15:22.359460 kubelet[2983]: I1028 00:15:22.359451 2983 volume_manager.go:313] "Starting Kubelet Volume Manager" Oct 28 00:15:22.359573 kubelet[2983]: E1028 00:15:22.358889 2983 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 28 00:15:22.360265 kubelet[2983]: I1028 00:15:22.360102 2983 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 28 00:15:22.360879 kubelet[2983]: I1028 00:15:22.360792 2983 reconciler.go:29] "Reconciler: start to sync state" Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.363874 2983 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.363954 2983 factory.go:223] Registration of the systemd container factory successfully Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.364011 2983 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.364637 2983 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.364659 2983 status_manager.go:244] "Starting to sync pod status with apiserver" Oct 28 00:15:22.365171 kubelet[2983]: I1028 00:15:22.364672 2983 kubelet.go:2427] "Starting kubelet main sync loop" Oct 28 00:15:22.365171 kubelet[2983]: E1028 00:15:22.364693 2983 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 28 00:15:22.368078 kubelet[2983]: I1028 00:15:22.367660 2983 factory.go:223] Registration of the containerd container factory successfully Oct 28 00:15:22.405227 kubelet[2983]: I1028 00:15:22.405205 2983 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 28 00:15:22.405350 kubelet[2983]: I1028 00:15:22.405219 2983 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 28 00:15:22.405382 kubelet[2983]: I1028 00:15:22.405352 2983 state_mem.go:36] "Initialized new in-memory state store" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405428 2983 state_mem.go:88] "Updated default CPUSet" cpuSet="" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405437 2983 state_mem.go:96] "Updated CPUSet assignments" assignments={} Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405449 2983 policy_none.go:49] "None policy: Start" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405454 2983 memory_manager.go:187] "Starting memorymanager" policy="None" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405460 2983 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405533 2983 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Oct 28 00:15:22.405952 kubelet[2983]: I1028 00:15:22.405540 2983 policy_none.go:47] "Start" Oct 28 00:15:22.408697 kubelet[2983]: E1028 00:15:22.408637 2983 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 28 00:15:22.408750 kubelet[2983]: I1028 00:15:22.408732 2983 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 28 00:15:22.408750 kubelet[2983]: I1028 00:15:22.408738 2983 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 28 00:15:22.409058 kubelet[2983]: I1028 00:15:22.409044 2983 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 28 00:15:22.417125 kubelet[2983]: E1028 00:15:22.415905 2983 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Oct 28 00:15:22.467020 kubelet[2983]: I1028 00:15:22.465249 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:22.467020 kubelet[2983]: I1028 00:15:22.465463 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:22.467020 kubelet[2983]: I1028 00:15:22.465574 2983 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:22.471625 kubelet[2983]: E1028 00:15:22.471598 2983 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:22.517042 kubelet[2983]: I1028 00:15:22.516802 2983 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 28 00:15:22.520997 kubelet[2983]: I1028 00:15:22.520952 2983 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Oct 28 00:15:22.521354 kubelet[2983]: I1028 00:15:22.521332 2983 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 28 00:15:22.662182 kubelet[2983]: I1028 00:15:22.662153 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:22.662182 kubelet[2983]: I1028 00:15:22.662178 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72ae43bf624d285361487631af8a6ba6-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72ae43bf624d285361487631af8a6ba6\") " pod="kube-system/kube-scheduler-localhost" Oct 28 00:15:22.662292 kubelet[2983]: I1028 00:15:22.662192 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:22.662292 kubelet[2983]: I1028 00:15:22.662201 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:22.662292 kubelet[2983]: I1028 00:15:22.662217 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:22.662292 kubelet[2983]: I1028 00:15:22.662229 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0aec103f82290345bcc057479e5c6dde-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0aec103f82290345bcc057479e5c6dde\") " 
pod="kube-system/kube-apiserver-localhost" Oct 28 00:15:22.662292 kubelet[2983]: I1028 00:15:22.662243 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:22.662373 kubelet[2983]: I1028 00:15:22.662253 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:22.662373 kubelet[2983]: I1028 00:15:22.662262 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ce161b3b11c90b0b844f2e4f86b4e8cd-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"ce161b3b11c90b0b844f2e4f86b4e8cd\") " pod="kube-system/kube-controller-manager-localhost" Oct 28 00:15:23.327209 kubelet[2983]: I1028 00:15:23.327169 2983 apiserver.go:52] "Watching apiserver" Oct 28 00:15:23.364484 kubelet[2983]: I1028 00:15:23.364448 2983 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 28 00:15:23.409175 kubelet[2983]: I1028 00:15:23.409129 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.409116755 podStartE2EDuration="3.409116755s" podCreationTimestamp="2025-10-28 00:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:15:23.404845357 +0000 UTC m=+1.159397703" watchObservedRunningTime="2025-10-28 00:15:23.409116755 +0000 UTC m=+1.163669098" Oct 28 00:15:23.413942 kubelet[2983]: I1028 00:15:23.413902 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.4138887740000001 podStartE2EDuration="1.413888774s" podCreationTimestamp="2025-10-28 00:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:15:23.409266872 +0000 UTC m=+1.163819221" watchObservedRunningTime="2025-10-28 00:15:23.413888774 +0000 UTC m=+1.168441115" Oct 28 00:15:23.419709 kubelet[2983]: I1028 00:15:23.419676 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.41966609 podStartE2EDuration="1.41966609s" podCreationTimestamp="2025-10-28 00:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:15:23.41406879 +0000 UTC m=+1.168621140" watchObservedRunningTime="2025-10-28 00:15:23.41966609 +0000 UTC m=+1.174218440" Oct 28 00:15:28.639359 kubelet[2983]: I1028 00:15:28.639337 2983 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Oct 28 00:15:28.640156 containerd[1690]: time="2025-10-28T00:15:28.639883180Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Oct 28 00:15:28.640306 kubelet[2983]: I1028 00:15:28.640024 2983 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Oct 28 00:15:29.616109 systemd[1]: Created slice kubepods-besteffort-pod2887c18d_f049_42dc_8152_47216d889ad1.slice - libcontainer container kubepods-besteffort-pod2887c18d_f049_42dc_8152_47216d889ad1.slice. Oct 28 00:15:29.707225 kubelet[2983]: I1028 00:15:29.707188 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wff4j\" (UniqueName: \"kubernetes.io/projected/2887c18d-f049-42dc-8152-47216d889ad1-kube-api-access-wff4j\") pod \"kube-proxy-2wn99\" (UID: \"2887c18d-f049-42dc-8152-47216d889ad1\") " pod="kube-system/kube-proxy-2wn99" Oct 28 00:15:29.707505 kubelet[2983]: I1028 00:15:29.707232 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2887c18d-f049-42dc-8152-47216d889ad1-kube-proxy\") pod \"kube-proxy-2wn99\" (UID: \"2887c18d-f049-42dc-8152-47216d889ad1\") " pod="kube-system/kube-proxy-2wn99" Oct 28 00:15:29.707505 kubelet[2983]: I1028 00:15:29.707251 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2887c18d-f049-42dc-8152-47216d889ad1-xtables-lock\") pod \"kube-proxy-2wn99\" (UID: \"2887c18d-f049-42dc-8152-47216d889ad1\") " pod="kube-system/kube-proxy-2wn99" Oct 28 00:15:29.707505 kubelet[2983]: I1028 00:15:29.707265 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2887c18d-f049-42dc-8152-47216d889ad1-lib-modules\") pod \"kube-proxy-2wn99\" (UID: \"2887c18d-f049-42dc-8152-47216d889ad1\") " pod="kube-system/kube-proxy-2wn99" Oct 28 00:15:29.876171 systemd[1]: Created slice kubepods-besteffort-pod697052c9_d7d6_4cbe_9b64_470caa85a6e3.slice - libcontainer container kubepods-besteffort-pod697052c9_d7d6_4cbe_9b64_470caa85a6e3.slice. 
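Aside: the "Created slice kubepods-besteffort-pod....slice" entries above reflect the systemd cgroup driver's naming scheme: the pod's QoS class plus its UID with dashes mapped to underscores. The helper below is an illustrative sketch derived from these entries, not kubelet code.

// slicename.go: hedged sketch of the slice naming visible in the log above.
package main

import (
	"fmt"
	"strings"
)

// sliceName builds "kubepods-<qos>-pod<UID>.slice", replacing dashes in the
// pod UID with underscores, as seen in the systemd entries above.
func sliceName(qos, podUID string) string {
	return "kubepods-" + qos + "-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	// Prints kubepods-besteffort-pod2887c18d_f049_42dc_8152_47216d889ad1.slice,
	// matching the slice created for kube-proxy-2wn99 above.
	fmt.Println(sliceName("besteffort", "2887c18d-f049-42dc-8152-47216d889ad1"))
}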
Oct 28 00:15:29.908648 kubelet[2983]: I1028 00:15:29.908471 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/697052c9-d7d6-4cbe-9b64-470caa85a6e3-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-7t64r\" (UID: \"697052c9-d7d6-4cbe-9b64-470caa85a6e3\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-7t64r" Oct 28 00:15:29.908648 kubelet[2983]: I1028 00:15:29.908623 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhlk\" (UniqueName: \"kubernetes.io/projected/697052c9-d7d6-4cbe-9b64-470caa85a6e3-kube-api-access-nfhlk\") pod \"tigera-operator-65cdcdfd6d-7t64r\" (UID: \"697052c9-d7d6-4cbe-9b64-470caa85a6e3\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-7t64r" Oct 28 00:15:29.936521 containerd[1690]: time="2025-10-28T00:15:29.936431771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2wn99,Uid:2887c18d-f049-42dc-8152-47216d889ad1,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:30.025298 containerd[1690]: time="2025-10-28T00:15:30.025226509Z" level=info msg="connecting to shim 44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93" address="unix:///run/containerd/s/962f92a352bcb4a832b5104cba4ee3383d267518b1569bda58d012a0e619a2f3" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:30.052730 systemd[1]: Started cri-containerd-44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93.scope - libcontainer container 44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93. Oct 28 00:15:30.075595 containerd[1690]: time="2025-10-28T00:15:30.075572777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2wn99,Uid:2887c18d-f049-42dc-8152-47216d889ad1,Namespace:kube-system,Attempt:0,} returns sandbox id \"44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93\"" Oct 28 00:15:30.095789 containerd[1690]: time="2025-10-28T00:15:30.095767729Z" level=info msg="CreateContainer within sandbox \"44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Oct 28 00:15:30.150875 containerd[1690]: time="2025-10-28T00:15:30.150803829Z" level=info msg="Container c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:30.198683 containerd[1690]: time="2025-10-28T00:15:30.198650074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-7t64r,Uid:697052c9-d7d6-4cbe-9b64-470caa85a6e3,Namespace:tigera-operator,Attempt:0,}" Oct 28 00:15:30.265616 containerd[1690]: time="2025-10-28T00:15:30.265488124Z" level=info msg="CreateContainer within sandbox \"44599c87cdd181ad93ae7aecec8e76ba5a15d3eacb359d313c1944a91c6e0d93\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc\"" Oct 28 00:15:30.266308 containerd[1690]: time="2025-10-28T00:15:30.266188075Z" level=info msg="StartContainer for \"c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc\"" Oct 28 00:15:30.267775 containerd[1690]: time="2025-10-28T00:15:30.267727542Z" level=info msg="connecting to shim c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc" address="unix:///run/containerd/s/962f92a352bcb4a832b5104cba4ee3383d267518b1569bda58d012a0e619a2f3" protocol=ttrpc version=3 Oct 28 00:15:30.276177 containerd[1690]: 
time="2025-10-28T00:15:30.276097192Z" level=info msg="connecting to shim b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849" address="unix:///run/containerd/s/b2d11404e27af0991ea549235afce8ff3a2733a64132cb817f9cb72b0bf2c82a" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:30.284701 systemd[1]: Started cri-containerd-c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc.scope - libcontainer container c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc. Oct 28 00:15:30.308632 systemd[1]: Started cri-containerd-b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849.scope - libcontainer container b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849. Oct 28 00:15:30.334115 containerd[1690]: time="2025-10-28T00:15:30.334057586Z" level=info msg="StartContainer for \"c51bea215f95cc2428119f495f2db6e06304ad7cefde64c5583c91436fe933dc\" returns successfully" Oct 28 00:15:30.346994 containerd[1690]: time="2025-10-28T00:15:30.346955684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-7t64r,Uid:697052c9-d7d6-4cbe-9b64-470caa85a6e3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849\"" Oct 28 00:15:30.348929 containerd[1690]: time="2025-10-28T00:15:30.348797336Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Oct 28 00:15:30.855847 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3938807766.mount: Deactivated successfully. Oct 28 00:15:31.917697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1785596200.mount: Deactivated successfully. Oct 28 00:15:33.236259 containerd[1690]: time="2025-10-28T00:15:33.235847291Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:33.236259 containerd[1690]: time="2025-10-28T00:15:33.236224843Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Oct 28 00:15:33.236259 containerd[1690]: time="2025-10-28T00:15:33.236235612Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:33.237275 containerd[1690]: time="2025-10-28T00:15:33.237262831Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:33.237671 containerd[1690]: time="2025-10-28T00:15:33.237654652Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.888569074s" Oct 28 00:15:33.237703 containerd[1690]: time="2025-10-28T00:15:33.237672016Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Oct 28 00:15:33.239656 containerd[1690]: time="2025-10-28T00:15:33.239640537Z" level=info msg="CreateContainer within sandbox \"b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Oct 28 00:15:33.244403 
containerd[1690]: time="2025-10-28T00:15:33.244003686Z" level=info msg="Container 683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:33.259547 containerd[1690]: time="2025-10-28T00:15:33.259520454Z" level=info msg="CreateContainer within sandbox \"b03e618d4d9e55bef6ffd364e0575a394d3e1727b2424cb1791e547d90d7e849\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24\"" Oct 28 00:15:33.260501 containerd[1690]: time="2025-10-28T00:15:33.260424930Z" level=info msg="StartContainer for \"683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24\"" Oct 28 00:15:33.261244 containerd[1690]: time="2025-10-28T00:15:33.261228047Z" level=info msg="connecting to shim 683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24" address="unix:///run/containerd/s/b2d11404e27af0991ea549235afce8ff3a2733a64132cb817f9cb72b0bf2c82a" protocol=ttrpc version=3 Oct 28 00:15:33.279622 systemd[1]: Started cri-containerd-683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24.scope - libcontainer container 683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24. Oct 28 00:15:33.300838 containerd[1690]: time="2025-10-28T00:15:33.300806153Z" level=info msg="StartContainer for \"683be2c199146d4c0d4e210bd8723bc36ad1241fda5b3a50e328d21dd9848f24\" returns successfully" Oct 28 00:15:33.419554 kubelet[2983]: I1028 00:15:33.419481 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-2wn99" podStartSLOduration=4.41946451 podStartE2EDuration="4.41946451s" podCreationTimestamp="2025-10-28 00:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:15:30.41896087 +0000 UTC m=+8.173513219" watchObservedRunningTime="2025-10-28 00:15:33.41946451 +0000 UTC m=+11.174016859" Oct 28 00:15:33.420877 kubelet[2983]: I1028 00:15:33.420568 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-7t64r" podStartSLOduration=1.530538283 podStartE2EDuration="4.420556938s" podCreationTimestamp="2025-10-28 00:15:29 +0000 UTC" firstStartedPulling="2025-10-28 00:15:30.348231729 +0000 UTC m=+8.102784067" lastFinishedPulling="2025-10-28 00:15:33.238250384 +0000 UTC m=+10.992802722" observedRunningTime="2025-10-28 00:15:33.420375926 +0000 UTC m=+11.174928277" watchObservedRunningTime="2025-10-28 00:15:33.420556938 +0000 UTC m=+11.175109288" Oct 28 00:15:38.677959 sudo[2000]: pam_unix(sudo:session): session closed for user root Oct 28 00:15:38.681032 sshd[1999]: Connection closed by 139.178.68.195 port 58766 Oct 28 00:15:38.680969 sshd-session[1993]: pam_unix(sshd:session): session closed for user core Oct 28 00:15:38.684126 systemd[1]: sshd@6-139.178.70.103:22-139.178.68.195:58766.service: Deactivated successfully. Oct 28 00:15:38.686819 systemd[1]: session-9.scope: Deactivated successfully. Oct 28 00:15:38.687773 systemd[1]: session-9.scope: Consumed 3.683s CPU time, 158.1M memory peak. Oct 28 00:15:38.689282 systemd-logind[1652]: Session 9 logged out. Waiting for processes to exit. Oct 28 00:15:38.691862 systemd-logind[1652]: Removed session 9. Oct 28 00:15:42.563437 systemd[1]: Created slice kubepods-besteffort-podbe41c11a_87b5_4ad8_8521_c145550d7cb7.slice - libcontainer container kubepods-besteffort-podbe41c11a_87b5_4ad8_8521_c145550d7cb7.slice. 
Oct 28 00:15:42.683480 kubelet[2983]: I1028 00:15:42.683445 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be41c11a-87b5-4ad8-8521-c145550d7cb7-tigera-ca-bundle\") pod \"calico-typha-bdd55d46-cj5lp\" (UID: \"be41c11a-87b5-4ad8-8521-c145550d7cb7\") " pod="calico-system/calico-typha-bdd55d46-cj5lp" Oct 28 00:15:42.683868 kubelet[2983]: I1028 00:15:42.683625 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdtx\" (UniqueName: \"kubernetes.io/projected/be41c11a-87b5-4ad8-8521-c145550d7cb7-kube-api-access-8mdtx\") pod \"calico-typha-bdd55d46-cj5lp\" (UID: \"be41c11a-87b5-4ad8-8521-c145550d7cb7\") " pod="calico-system/calico-typha-bdd55d46-cj5lp" Oct 28 00:15:42.683868 kubelet[2983]: I1028 00:15:42.683643 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/be41c11a-87b5-4ad8-8521-c145550d7cb7-typha-certs\") pod \"calico-typha-bdd55d46-cj5lp\" (UID: \"be41c11a-87b5-4ad8-8521-c145550d7cb7\") " pod="calico-system/calico-typha-bdd55d46-cj5lp" Oct 28 00:15:42.756402 systemd[1]: Created slice kubepods-besteffort-podecaa5059_d97a_4a75_911e_eea576e23f0b.slice - libcontainer container kubepods-besteffort-podecaa5059_d97a_4a75_911e_eea576e23f0b.slice. Oct 28 00:15:42.868590 containerd[1690]: time="2025-10-28T00:15:42.868478473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bdd55d46-cj5lp,Uid:be41c11a-87b5-4ad8-8521-c145550d7cb7,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:42.879046 containerd[1690]: time="2025-10-28T00:15:42.879013366Z" level=info msg="connecting to shim 3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448" address="unix:///run/containerd/s/5fa7628417b89c3361f471280628e03eca3099348986ad1f4c937486ecf86b4e" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:42.885519 kubelet[2983]: I1028 00:15:42.884639 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-cni-bin-dir\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885519 kubelet[2983]: I1028 00:15:42.884667 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-cni-log-dir\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885519 kubelet[2983]: I1028 00:15:42.884687 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecaa5059-d97a-4a75-911e-eea576e23f0b-tigera-ca-bundle\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885519 kubelet[2983]: I1028 00:15:42.884707 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-xtables-lock\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 
00:15:42.885519 kubelet[2983]: I1028 00:15:42.884717 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-cni-net-dir\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885657 kubelet[2983]: I1028 00:15:42.884725 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-lib-modules\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885657 kubelet[2983]: I1028 00:15:42.884744 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-var-run-calico\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885657 kubelet[2983]: I1028 00:15:42.884760 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ecaa5059-d97a-4a75-911e-eea576e23f0b-node-certs\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885657 kubelet[2983]: I1028 00:15:42.884769 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-policysync\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885657 kubelet[2983]: I1028 00:15:42.884777 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-var-lib-calico\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885745 kubelet[2983]: I1028 00:15:42.884785 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ecaa5059-d97a-4a75-911e-eea576e23f0b-flexvol-driver-host\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.885745 kubelet[2983]: I1028 00:15:42.884795 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8gtb\" (UniqueName: \"kubernetes.io/projected/ecaa5059-d97a-4a75-911e-eea576e23f0b-kube-api-access-c8gtb\") pod \"calico-node-mjh2g\" (UID: \"ecaa5059-d97a-4a75-911e-eea576e23f0b\") " pod="calico-system/calico-node-mjh2g" Oct 28 00:15:42.897597 systemd[1]: Started cri-containerd-3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448.scope - libcontainer container 3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448. 
Oct 28 00:15:42.940351 kubelet[2983]: E1028 00:15:42.940125 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:42.961731 containerd[1690]: time="2025-10-28T00:15:42.961702338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bdd55d46-cj5lp,Uid:be41c11a-87b5-4ad8-8521-c145550d7cb7,Namespace:calico-system,Attempt:0,} returns sandbox id \"3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448\"" Oct 28 00:15:42.962948 containerd[1690]: time="2025-10-28T00:15:42.962930061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Oct 28 00:15:43.028844 kubelet[2983]: E1028 00:15:43.028772 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.028844 kubelet[2983]: W1028 00:15:43.028798 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.028844 kubelet[2983]: E1028 00:15:43.028815 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.072947 containerd[1690]: time="2025-10-28T00:15:43.072921965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mjh2g,Uid:ecaa5059-d97a-4a75-911e-eea576e23f0b,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:43.087454 kubelet[2983]: E1028 00:15:43.087403 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.087454 kubelet[2983]: W1028 00:15:43.087421 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.087454 kubelet[2983]: E1028 00:15:43.087437 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.087778 kubelet[2983]: I1028 00:15:43.087656 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gg4h\" (UniqueName: \"kubernetes.io/projected/f0632c4d-dd7d-4536-90e2-d70b340d8f15-kube-api-access-4gg4h\") pod \"csi-node-driver-7g789\" (UID: \"f0632c4d-dd7d-4536-90e2-d70b340d8f15\") " pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:43.087869 kubelet[2983]: E1028 00:15:43.087844 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.087869 kubelet[2983]: W1028 00:15:43.087853 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.087869 kubelet[2983]: E1028 00:15:43.087861 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.088132 kubelet[2983]: E1028 00:15:43.088079 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.088132 kubelet[2983]: W1028 00:15:43.088087 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.088132 kubelet[2983]: E1028 00:15:43.088094 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.088309 kubelet[2983]: E1028 00:15:43.088274 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.088309 kubelet[2983]: W1028 00:15:43.088281 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.088309 kubelet[2983]: E1028 00:15:43.088287 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.088456 kubelet[2983]: I1028 00:15:43.088367 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f0632c4d-dd7d-4536-90e2-d70b340d8f15-registration-dir\") pod \"csi-node-driver-7g789\" (UID: \"f0632c4d-dd7d-4536-90e2-d70b340d8f15\") " pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:43.088585 kubelet[2983]: E1028 00:15:43.088563 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.088585 kubelet[2983]: W1028 00:15:43.088570 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.088585 kubelet[2983]: E1028 00:15:43.088578 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.088759 kubelet[2983]: I1028 00:15:43.088690 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f0632c4d-dd7d-4536-90e2-d70b340d8f15-kubelet-dir\") pod \"csi-node-driver-7g789\" (UID: \"f0632c4d-dd7d-4536-90e2-d70b340d8f15\") " pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:43.088867 kubelet[2983]: E1028 00:15:43.088861 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.088973 kubelet[2983]: W1028 00:15:43.088912 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.088973 kubelet[2983]: E1028 00:15:43.088922 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.088973 kubelet[2983]: I1028 00:15:43.088941 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f0632c4d-dd7d-4536-90e2-d70b340d8f15-socket-dir\") pod \"csi-node-driver-7g789\" (UID: \"f0632c4d-dd7d-4536-90e2-d70b340d8f15\") " pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:43.089560 kubelet[2983]: E1028 00:15:43.089536 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.089560 kubelet[2983]: W1028 00:15:43.089545 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.089560 kubelet[2983]: E1028 00:15:43.089552 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.089727 kubelet[2983]: I1028 00:15:43.089669 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f0632c4d-dd7d-4536-90e2-d70b340d8f15-varrun\") pod \"csi-node-driver-7g789\" (UID: \"f0632c4d-dd7d-4536-90e2-d70b340d8f15\") " pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:43.089860 kubelet[2983]: E1028 00:15:43.089836 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.089860 kubelet[2983]: W1028 00:15:43.089844 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.089860 kubelet[2983]: E1028 00:15:43.089852 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.090085 kubelet[2983]: E1028 00:15:43.090064 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.090085 kubelet[2983]: W1028 00:15:43.090071 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.090085 kubelet[2983]: E1028 00:15:43.090077 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.091654 kubelet[2983]: E1028 00:15:43.091625 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.091654 kubelet[2983]: W1028 00:15:43.091634 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.091654 kubelet[2983]: E1028 00:15:43.091642 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.091950 kubelet[2983]: E1028 00:15:43.091895 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.091950 kubelet[2983]: W1028 00:15:43.091902 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.091950 kubelet[2983]: E1028 00:15:43.091909 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.092114 kubelet[2983]: E1028 00:15:43.092091 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.092114 kubelet[2983]: W1028 00:15:43.092098 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.092114 kubelet[2983]: E1028 00:15:43.092105 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.092350 kubelet[2983]: E1028 00:15:43.092326 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.092350 kubelet[2983]: W1028 00:15:43.092334 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.092350 kubelet[2983]: E1028 00:15:43.092340 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.092588 kubelet[2983]: E1028 00:15:43.092567 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.092588 kubelet[2983]: W1028 00:15:43.092574 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.092588 kubelet[2983]: E1028 00:15:43.092580 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.092807 kubelet[2983]: E1028 00:15:43.092783 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.092807 kubelet[2983]: W1028 00:15:43.092790 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.092807 kubelet[2983]: E1028 00:15:43.092796 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.120653 containerd[1690]: time="2025-10-28T00:15:43.119629725Z" level=info msg="connecting to shim c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd" address="unix:///run/containerd/s/d1272acb1da80ac89c16ef0a2b5cadeaa6febb5711398ad3ecf9ea4951fcc1af" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:15:43.153652 systemd[1]: Started cri-containerd-c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd.scope - libcontainer container c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd. Oct 28 00:15:43.174883 containerd[1690]: time="2025-10-28T00:15:43.174848603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mjh2g,Uid:ecaa5059-d97a-4a75-911e-eea576e23f0b,Namespace:calico-system,Attempt:0,} returns sandbox id \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\"" Oct 28 00:15:43.191880 kubelet[2983]: E1028 00:15:43.191761 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.191880 kubelet[2983]: W1028 00:15:43.191779 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.191880 kubelet[2983]: E1028 00:15:43.191792 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.192115 kubelet[2983]: E1028 00:15:43.192071 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.192115 kubelet[2983]: W1028 00:15:43.192081 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.192115 kubelet[2983]: E1028 00:15:43.192089 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.192260 kubelet[2983]: E1028 00:15:43.192239 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.192260 kubelet[2983]: W1028 00:15:43.192257 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.192315 kubelet[2983]: E1028 00:15:43.192268 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.192395 kubelet[2983]: E1028 00:15:43.192381 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.192395 kubelet[2983]: W1028 00:15:43.192391 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.192395 kubelet[2983]: E1028 00:15:43.192399 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.192544 kubelet[2983]: E1028 00:15:43.192530 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.192544 kubelet[2983]: W1028 00:15:43.192539 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.192626 kubelet[2983]: E1028 00:15:43.192547 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.193846 kubelet[2983]: E1028 00:15:43.193820 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.193846 kubelet[2983]: W1028 00:15:43.193840 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.193979 kubelet[2983]: E1028 00:15:43.193867 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.194093 kubelet[2983]: E1028 00:15:43.194081 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.194127 kubelet[2983]: W1028 00:15:43.194098 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.194127 kubelet[2983]: E1028 00:15:43.194105 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.194279 kubelet[2983]: E1028 00:15:43.194265 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.194279 kubelet[2983]: W1028 00:15:43.194275 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.194372 kubelet[2983]: E1028 00:15:43.194280 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.194395 kubelet[2983]: E1028 00:15:43.194391 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195127 kubelet[2983]: W1028 00:15:43.194396 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194401 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194532 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195127 kubelet[2983]: W1028 00:15:43.194548 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194564 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194676 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195127 kubelet[2983]: W1028 00:15:43.194683 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194688 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195127 kubelet[2983]: E1028 00:15:43.194795 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195127 kubelet[2983]: W1028 00:15:43.194800 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195352 kubelet[2983]: E1028 00:15:43.194808 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195352 kubelet[2983]: E1028 00:15:43.194929 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195352 kubelet[2983]: W1028 00:15:43.194936 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195352 kubelet[2983]: E1028 00:15:43.194943 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.195541 kubelet[2983]: E1028 00:15:43.195447 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195541 kubelet[2983]: W1028 00:15:43.195455 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195541 kubelet[2983]: E1028 00:15:43.195462 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195965 kubelet[2983]: E1028 00:15:43.195597 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195965 kubelet[2983]: W1028 00:15:43.195602 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195965 kubelet[2983]: E1028 00:15:43.195607 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.195965 kubelet[2983]: E1028 00:15:43.195749 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.195965 kubelet[2983]: W1028 00:15:43.195756 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.195965 kubelet[2983]: E1028 00:15:43.195763 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.198369 kubelet[2983]: E1028 00:15:43.197645 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.198369 kubelet[2983]: W1028 00:15:43.197666 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.198369 kubelet[2983]: E1028 00:15:43.197688 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.198369 kubelet[2983]: E1028 00:15:43.198217 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.198369 kubelet[2983]: W1028 00:15:43.198229 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.198369 kubelet[2983]: E1028 00:15:43.198248 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.198369 kubelet[2983]: E1028 00:15:43.198379 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.198369 kubelet[2983]: W1028 00:15:43.198385 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.198701 kubelet[2983]: E1028 00:15:43.198391 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.198701 kubelet[2983]: E1028 00:15:43.198478 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.198701 kubelet[2983]: W1028 00:15:43.198484 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.198701 kubelet[2983]: E1028 00:15:43.198572 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.198701 kubelet[2983]: E1028 00:15:43.198688 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.198701 kubelet[2983]: W1028 00:15:43.198693 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.198701 kubelet[2983]: E1028 00:15:43.198698 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.199193 kubelet[2983]: E1028 00:15:43.199180 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.199193 kubelet[2983]: W1028 00:15:43.199191 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.199253 kubelet[2983]: E1028 00:15:43.199200 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.199303 kubelet[2983]: E1028 00:15:43.199293 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.199303 kubelet[2983]: W1028 00:15:43.199300 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.199357 kubelet[2983]: E1028 00:15:43.199305 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:43.199392 kubelet[2983]: E1028 00:15:43.199378 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.199392 kubelet[2983]: W1028 00:15:43.199387 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.199392 kubelet[2983]: E1028 00:15:43.199392 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.199869 kubelet[2983]: E1028 00:15:43.199859 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.199905 kubelet[2983]: W1028 00:15:43.199870 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.199905 kubelet[2983]: E1028 00:15:43.199880 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:43.206296 kubelet[2983]: E1028 00:15:43.206245 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:43.206296 kubelet[2983]: W1028 00:15:43.206258 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:43.206296 kubelet[2983]: E1028 00:15:43.206271 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:44.366631 kubelet[2983]: E1028 00:15:44.365530 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:44.644724 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2728797960.mount: Deactivated successfully. 
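The repeated "Failed to unmarshal output for command: init" / "executable file not found in $PATH" pairs above come from the kubelet probing the FlexVolume plugin directory before Calico's driver has been installed: the binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds does not exist yet, so the driver call produces no stdout and the empty string fails JSON unmarshalling ("unexpected end of JSON input"). For orientation only, the hypothetical Go stub below (not the Calico uds driver) sketches the contract the kubelet's driver-call expects: every subcommand must print a JSON status object to stdout.

// flexvol_stub.go - a minimal, hypothetical FlexVolume driver stub.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape the kubelet parses from driver stdout.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		os.Exit(1)
	}
	var resp driverStatus
	switch os.Args[1] {
	case "init":
		// Advertise no attach support; the kubelet then skips attach/detach calls.
		resp = driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
	default:
		// Anything this stub does not handle is reported as unsupported.
		resp = driverStatus{Status: "Not supported", Message: "stub driver: " + os.Args[1]}
	}
	out, _ := json.Marshal(resp)
	fmt.Println(out == nil)
	fmt.Println(string(out))
}

Once a driver that answers "init" this way is present under nodeagent~uds, these probe errors should stop, which is what eventually happens after the flexvol-driver container further down runs.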
Oct 28 00:15:46.007481 containerd[1690]: time="2025-10-28T00:15:46.007439228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:46.021225 containerd[1690]: time="2025-10-28T00:15:46.021186845Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Oct 28 00:15:46.037716 containerd[1690]: time="2025-10-28T00:15:46.037676556Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:46.046960 containerd[1690]: time="2025-10-28T00:15:46.046895200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:46.047506 containerd[1690]: time="2025-10-28T00:15:46.047296843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.084346129s" Oct 28 00:15:46.047506 containerd[1690]: time="2025-10-28T00:15:46.047316253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Oct 28 00:15:46.048300 containerd[1690]: time="2025-10-28T00:15:46.048136880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Oct 28 00:15:46.080073 containerd[1690]: time="2025-10-28T00:15:46.080048230Z" level=info msg="CreateContainer within sandbox \"3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Oct 28 00:15:46.087794 containerd[1690]: time="2025-10-28T00:15:46.087607296Z" level=info msg="Container d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:46.096182 containerd[1690]: time="2025-10-28T00:15:46.096159022Z" level=info msg="CreateContainer within sandbox \"3f036b9cb1c789fcdebb294aa5e98b00de42d6c52509edf426da326c6049d448\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d\"" Oct 28 00:15:46.097803 containerd[1690]: time="2025-10-28T00:15:46.097693679Z" level=info msg="StartContainer for \"d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d\"" Oct 28 00:15:46.099437 containerd[1690]: time="2025-10-28T00:15:46.099395630Z" level=info msg="connecting to shim d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d" address="unix:///run/containerd/s/5fa7628417b89c3361f471280628e03eca3099348986ad1f4c937486ecf86b4e" protocol=ttrpc version=3 Oct 28 00:15:46.138587 systemd[1]: Started cri-containerd-d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d.scope - libcontainer container d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d. 
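The containerd entries above trace a normal CRI image pull and container start: ghcr.io/flatcar/calico/typha:v3.30.4 is fetched (about 35 MB read, reported as roughly 3.08 s), a calico-typha container is created in an existing sandbox, and containerd connects to the runtime shim over ttrpc before systemd starts the cri-containerd scope. As a rough sketch only, the same pull can be reproduced against the containerd socket with its Go client; this assumes the pre-2.0 module path github.com/containerd/containerd, the default /run/containerd/containerd.sock socket, and the k8s.io namespace that CRI-managed images live in.

package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Talk to the same containerd instance the kubelet drives via CRI.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI plugin keeps its images in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull and unpack the Typha image seen in the log above.
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.30.4", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("pulled %s digest=%s", img.Name(), img.Target().Digest)
}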
Oct 28 00:15:46.182188 containerd[1690]: time="2025-10-28T00:15:46.182134689Z" level=info msg="StartContainer for \"d005726ac95e19457507a9212b8e4c147e2f4d4774bd9d6183d386bcaeb6f67d\" returns successfully" Oct 28 00:15:46.384815 kubelet[2983]: E1028 00:15:46.384768 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:46.474867 kubelet[2983]: I1028 00:15:46.473590 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-bdd55d46-cj5lp" podStartSLOduration=1.387996739 podStartE2EDuration="4.473577646s" podCreationTimestamp="2025-10-28 00:15:42 +0000 UTC" firstStartedPulling="2025-10-28 00:15:42.962474967 +0000 UTC m=+20.717027305" lastFinishedPulling="2025-10-28 00:15:46.048055874 +0000 UTC m=+23.802608212" observedRunningTime="2025-10-28 00:15:46.473389445 +0000 UTC m=+24.227941795" watchObservedRunningTime="2025-10-28 00:15:46.473577646 +0000 UTC m=+24.228129992" Oct 28 00:15:46.538333 kubelet[2983]: E1028 00:15:46.538222 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.538333 kubelet[2983]: W1028 00:15:46.538240 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.544333 kubelet[2983]: E1028 00:15:46.541920 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.544333 kubelet[2983]: E1028 00:15:46.542085 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.544333 kubelet[2983]: W1028 00:15:46.542093 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.544333 kubelet[2983]: E1028 00:15:46.542106 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.544333 kubelet[2983]: E1028 00:15:46.542206 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.544333 kubelet[2983]: W1028 00:15:46.542211 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.544333 kubelet[2983]: E1028 00:15:46.542218 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.552070 kubelet[2983]: E1028 00:15:46.552015 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.552070 kubelet[2983]: W1028 00:15:46.552037 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.552070 kubelet[2983]: E1028 00:15:46.552055 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.552504 kubelet[2983]: E1028 00:15:46.552472 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.552504 kubelet[2983]: W1028 00:15:46.552479 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.552504 kubelet[2983]: E1028 00:15:46.552485 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552745 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558362 kubelet[2983]: W1028 00:15:46.552749 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552754 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552832 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558362 kubelet[2983]: W1028 00:15:46.552836 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552841 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552920 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558362 kubelet[2983]: W1028 00:15:46.552925 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.552930 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.558362 kubelet[2983]: E1028 00:15:46.553009 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558601 kubelet[2983]: W1028 00:15:46.553014 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553018 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553093 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558601 kubelet[2983]: W1028 00:15:46.553097 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553101 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553174 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558601 kubelet[2983]: W1028 00:15:46.553178 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553184 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.558601 kubelet[2983]: E1028 00:15:46.553256 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.558601 kubelet[2983]: W1028 00:15:46.553261 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553265 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553346 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.560538 kubelet[2983]: W1028 00:15:46.553350 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553354 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553435 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.560538 kubelet[2983]: W1028 00:15:46.553439 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553443 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553531 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.560538 kubelet[2983]: W1028 00:15:46.553536 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.560538 kubelet[2983]: E1028 00:15:46.553541 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.619634 kubelet[2983]: E1028 00:15:46.619611 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.619634 kubelet[2983]: W1028 00:15:46.619627 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.619758 kubelet[2983]: E1028 00:15:46.619642 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.620787 kubelet[2983]: E1028 00:15:46.620774 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.620787 kubelet[2983]: W1028 00:15:46.620785 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.620930 kubelet[2983]: E1028 00:15:46.620797 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.620930 kubelet[2983]: E1028 00:15:46.620906 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.620930 kubelet[2983]: W1028 00:15:46.620911 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.620930 kubelet[2983]: E1028 00:15:46.620916 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.621033 kubelet[2983]: E1028 00:15:46.621024 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.621076 kubelet[2983]: W1028 00:15:46.621034 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.621076 kubelet[2983]: E1028 00:15:46.621041 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.621193 kubelet[2983]: E1028 00:15:46.621140 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.621193 kubelet[2983]: W1028 00:15:46.621144 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.621193 kubelet[2983]: E1028 00:15:46.621149 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.621269 kubelet[2983]: E1028 00:15:46.621230 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.621269 kubelet[2983]: W1028 00:15:46.621234 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.621269 kubelet[2983]: E1028 00:15:46.621241 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.621470 kubelet[2983]: E1028 00:15:46.621461 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.621470 kubelet[2983]: W1028 00:15:46.621468 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.622757 kubelet[2983]: E1028 00:15:46.621476 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.622757 kubelet[2983]: E1028 00:15:46.622510 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.622757 kubelet[2983]: W1028 00:15:46.622518 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.622757 kubelet[2983]: E1028 00:15:46.622525 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.622757 kubelet[2983]: E1028 00:15:46.622630 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.622757 kubelet[2983]: W1028 00:15:46.622635 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.622757 kubelet[2983]: E1028 00:15:46.622643 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.623021 kubelet[2983]: E1028 00:15:46.622950 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.623021 kubelet[2983]: W1028 00:15:46.622957 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.623021 kubelet[2983]: E1028 00:15:46.622963 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.623202 kubelet[2983]: E1028 00:15:46.623068 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.623202 kubelet[2983]: W1028 00:15:46.623073 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.623202 kubelet[2983]: E1028 00:15:46.623078 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.623387 kubelet[2983]: E1028 00:15:46.623379 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.623460 kubelet[2983]: W1028 00:15:46.623426 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.623460 kubelet[2983]: E1028 00:15:46.623436 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.624632 kubelet[2983]: E1028 00:15:46.624589 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.624632 kubelet[2983]: W1028 00:15:46.624599 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.624632 kubelet[2983]: E1028 00:15:46.624605 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:46.624828 kubelet[2983]: E1028 00:15:46.624791 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.624828 kubelet[2983]: W1028 00:15:46.624798 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.624828 kubelet[2983]: E1028 00:15:46.624804 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.624999 kubelet[2983]: E1028 00:15:46.624966 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.624999 kubelet[2983]: W1028 00:15:46.624972 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.624999 kubelet[2983]: E1028 00:15:46.624979 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.625175 kubelet[2983]: E1028 00:15:46.625131 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.625175 kubelet[2983]: W1028 00:15:46.625137 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.625175 kubelet[2983]: E1028 00:15:46.625142 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.625504 kubelet[2983]: E1028 00:15:46.625309 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.625504 kubelet[2983]: W1028 00:15:46.625315 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.625504 kubelet[2983]: E1028 00:15:46.625320 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:46.625663 kubelet[2983]: E1028 00:15:46.625657 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:46.625755 kubelet[2983]: W1028 00:15:46.625695 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:46.625798 kubelet[2983]: E1028 00:15:46.625790 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.459670 kubelet[2983]: E1028 00:15:47.459609 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.459670 kubelet[2983]: W1028 00:15:47.459626 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.459670 kubelet[2983]: E1028 00:15:47.459640 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.460328 kubelet[2983]: E1028 00:15:47.460037 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.460328 kubelet[2983]: W1028 00:15:47.460044 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.460328 kubelet[2983]: E1028 00:15:47.460051 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.460328 kubelet[2983]: E1028 00:15:47.460271 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.460328 kubelet[2983]: W1028 00:15:47.460277 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.460328 kubelet[2983]: E1028 00:15:47.460282 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.460619 kubelet[2983]: E1028 00:15:47.460524 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.460692 kubelet[2983]: W1028 00:15:47.460654 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.460692 kubelet[2983]: E1028 00:15:47.460663 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.460955 kubelet[2983]: E1028 00:15:47.460922 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.460955 kubelet[2983]: W1028 00:15:47.460930 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.460955 kubelet[2983]: E1028 00:15:47.460935 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.461134 kubelet[2983]: E1028 00:15:47.461127 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.461272 kubelet[2983]: W1028 00:15:47.461243 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.461272 kubelet[2983]: E1028 00:15:47.461251 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.461435 kubelet[2983]: E1028 00:15:47.461408 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.461521 kubelet[2983]: W1028 00:15:47.461480 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.461521 kubelet[2983]: E1028 00:15:47.461488 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.461734 kubelet[2983]: E1028 00:15:47.461693 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.461734 kubelet[2983]: W1028 00:15:47.461699 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.461734 kubelet[2983]: E1028 00:15:47.461705 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.461996 kubelet[2983]: E1028 00:15:47.461943 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.461996 kubelet[2983]: W1028 00:15:47.461950 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.461996 kubelet[2983]: E1028 00:15:47.461955 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.462150 kubelet[2983]: E1028 00:15:47.462100 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.462150 kubelet[2983]: W1028 00:15:47.462105 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.462150 kubelet[2983]: E1028 00:15:47.462110 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.462341 kubelet[2983]: E1028 00:15:47.462245 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.462341 kubelet[2983]: W1028 00:15:47.462250 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.462341 kubelet[2983]: E1028 00:15:47.462257 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.462520 kubelet[2983]: E1028 00:15:47.462514 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.462620 kubelet[2983]: W1028 00:15:47.462584 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.462620 kubelet[2983]: E1028 00:15:47.462593 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.462900 kubelet[2983]: E1028 00:15:47.462864 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.462900 kubelet[2983]: W1028 00:15:47.462871 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.462900 kubelet[2983]: E1028 00:15:47.462877 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.463174 kubelet[2983]: E1028 00:15:47.463136 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.463174 kubelet[2983]: W1028 00:15:47.463144 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.463174 kubelet[2983]: E1028 00:15:47.463151 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.463520 kubelet[2983]: E1028 00:15:47.463448 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.463520 kubelet[2983]: W1028 00:15:47.463462 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.463520 kubelet[2983]: E1028 00:15:47.463468 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.493740 containerd[1690]: time="2025-10-28T00:15:47.493454407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:47.493990 containerd[1690]: time="2025-10-28T00:15:47.493958196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Oct 28 00:15:47.494900 containerd[1690]: time="2025-10-28T00:15:47.494198598Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:47.495197 containerd[1690]: time="2025-10-28T00:15:47.495183572Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:47.495693 containerd[1690]: time="2025-10-28T00:15:47.495667978Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.447469514s" Oct 28 00:15:47.495747 containerd[1690]: time="2025-10-28T00:15:47.495737133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Oct 28 00:15:47.498601 containerd[1690]: time="2025-10-28T00:15:47.498310282Z" level=info msg="CreateContainer within sandbox \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 28 00:15:47.527415 kubelet[2983]: E1028 00:15:47.527400 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.527530 kubelet[2983]: W1028 00:15:47.527520 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.527596 kubelet[2983]: E1028 00:15:47.527587 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.527760 kubelet[2983]: E1028 00:15:47.527752 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.527807 kubelet[2983]: W1028 00:15:47.527801 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.527844 kubelet[2983]: E1028 00:15:47.527837 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.528027 kubelet[2983]: E1028 00:15:47.528022 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528078 kubelet[2983]: W1028 00:15:47.528070 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528124 kubelet[2983]: E1028 00:15:47.528116 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.528295 kubelet[2983]: E1028 00:15:47.528283 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528295 kubelet[2983]: W1028 00:15:47.528293 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528375 kubelet[2983]: E1028 00:15:47.528299 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.528424 kubelet[2983]: E1028 00:15:47.528413 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528424 kubelet[2983]: W1028 00:15:47.528421 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528539 kubelet[2983]: E1028 00:15:47.528426 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.528539 kubelet[2983]: E1028 00:15:47.528504 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528539 kubelet[2983]: W1028 00:15:47.528509 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528539 kubelet[2983]: E1028 00:15:47.528513 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.528697 kubelet[2983]: E1028 00:15:47.528625 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528697 kubelet[2983]: W1028 00:15:47.528638 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528697 kubelet[2983]: E1028 00:15:47.528644 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.528870 kubelet[2983]: E1028 00:15:47.528863 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.528940 kubelet[2983]: W1028 00:15:47.528901 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.528940 kubelet[2983]: E1028 00:15:47.528909 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.529068 kubelet[2983]: E1028 00:15:47.529062 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.529185 kubelet[2983]: W1028 00:15:47.529102 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.529185 kubelet[2983]: E1028 00:15:47.529151 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.529368 kubelet[2983]: E1028 00:15:47.529319 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.529368 kubelet[2983]: W1028 00:15:47.529326 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.529368 kubelet[2983]: E1028 00:15:47.529332 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.529521 kubelet[2983]: E1028 00:15:47.529513 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.529603 kubelet[2983]: W1028 00:15:47.529560 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.529603 kubelet[2983]: E1028 00:15:47.529568 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.529752 kubelet[2983]: E1028 00:15:47.529720 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.529752 kubelet[2983]: W1028 00:15:47.529726 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.529752 kubelet[2983]: E1028 00:15:47.529732 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.530043 kubelet[2983]: E1028 00:15:47.530035 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.530086 kubelet[2983]: W1028 00:15:47.530078 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.530138 kubelet[2983]: E1028 00:15:47.530132 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.530293 kubelet[2983]: E1028 00:15:47.530281 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.530339 kubelet[2983]: W1028 00:15:47.530327 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.530424 kubelet[2983]: E1028 00:15:47.530367 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.530511 kubelet[2983]: E1028 00:15:47.530504 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.530555 kubelet[2983]: W1028 00:15:47.530547 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.530738 kubelet[2983]: E1028 00:15:47.530586 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.530873 kubelet[2983]: E1028 00:15:47.530867 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.530915 kubelet[2983]: W1028 00:15:47.530909 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.530952 kubelet[2983]: E1028 00:15:47.530946 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.531506 kubelet[2983]: E1028 00:15:47.531484 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.531534 kubelet[2983]: W1028 00:15:47.531508 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.531688 kubelet[2983]: E1028 00:15:47.531517 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 28 00:15:47.536123 kubelet[2983]: E1028 00:15:47.536111 2983 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 28 00:15:47.536123 kubelet[2983]: W1028 00:15:47.536120 2983 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 28 00:15:47.536174 kubelet[2983]: E1028 00:15:47.536127 2983 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 28 00:15:47.562334 containerd[1690]: time="2025-10-28T00:15:47.561659310Z" level=info msg="Container 7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:47.564882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount578818847.mount: Deactivated successfully. Oct 28 00:15:47.570170 containerd[1690]: time="2025-10-28T00:15:47.570146054Z" level=info msg="CreateContainer within sandbox \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\"" Oct 28 00:15:47.570684 containerd[1690]: time="2025-10-28T00:15:47.570660018Z" level=info msg="StartContainer for \"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\"" Oct 28 00:15:47.571561 containerd[1690]: time="2025-10-28T00:15:47.571541031Z" level=info msg="connecting to shim 7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2" address="unix:///run/containerd/s/d1272acb1da80ac89c16ef0a2b5cadeaa6febb5711398ad3ecf9ea4951fcc1af" protocol=ttrpc version=3 Oct 28 00:15:47.591643 systemd[1]: Started cri-containerd-7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2.scope - libcontainer container 7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2. Oct 28 00:15:47.623952 containerd[1690]: time="2025-10-28T00:15:47.623872367Z" level=info msg="StartContainer for \"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\" returns successfully" Oct 28 00:15:47.628689 systemd[1]: cri-containerd-7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2.scope: Deactivated successfully. Oct 28 00:15:47.640508 containerd[1690]: time="2025-10-28T00:15:47.640469515Z" level=info msg="received exit event container_id:\"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\" id:\"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\" pid:3660 exited_at:{seconds:1761610547 nanos:630775482}" Oct 28 00:15:47.649000 containerd[1690]: time="2025-10-28T00:15:47.648948951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\" id:\"7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2\" pid:3660 exited_at:{seconds:1761610547 nanos:630775482}" Oct 28 00:15:47.662477 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7c137b797b29eeba5468d9e221e1da2b352d51252873de7c11d5f5c4081056d2-rootfs.mount: Deactivated successfully. 
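The flexvol-driver container above, created from the pod2daemon-flexvol image pulled just before, exits almost immediately: its scope is deactivated and containerd emits a TaskExit event. That is consistent with its usual role as a Calico init step that drops the uds FlexVolume binary into the nodeagent~uds plugin directory the earlier probe errors point at; no further FlexVolume probe failures appear after this point. The exited_at fields in those events are plain Unix seconds/nanos; a minimal check, using the values from the log above, shows they line up with the journald timestamps.

package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at from the TaskExit event for the flexvol-driver container above.
	exitedAt := time.Unix(1761610547, 630775482).UTC()
	// Prints 2025-10-28T00:15:47.630775482Z, just before the 00:15:47.640
	// "received exit event" journal entry, as expected.
	fmt.Println(exitedAt.Format(time.RFC3339Nano))
}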
Oct 28 00:15:48.365822 kubelet[2983]: E1028 00:15:48.365696 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:48.460094 containerd[1690]: time="2025-10-28T00:15:48.458421394Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Oct 28 00:15:50.365591 kubelet[2983]: E1028 00:15:50.365278 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:52.365968 kubelet[2983]: E1028 00:15:52.365604 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:54.005422 containerd[1690]: time="2025-10-28T00:15:54.004958259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:54.005422 containerd[1690]: time="2025-10-28T00:15:54.005405979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Oct 28 00:15:54.006504 containerd[1690]: time="2025-10-28T00:15:54.005891173Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:54.007512 containerd[1690]: time="2025-10-28T00:15:54.007208030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:15:54.007667 containerd[1690]: time="2025-10-28T00:15:54.007647600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 5.549172313s" Oct 28 00:15:54.007705 containerd[1690]: time="2025-10-28T00:15:54.007668819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Oct 28 00:15:54.020454 containerd[1690]: time="2025-10-28T00:15:54.020426408Z" level=info msg="CreateContainer within sandbox \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 28 00:15:54.026512 containerd[1690]: time="2025-10-28T00:15:54.025936452Z" level=info msg="Container 785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:15:54.040833 containerd[1690]: time="2025-10-28T00:15:54.040812112Z" level=info msg="CreateContainer within sandbox 
\"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\"" Oct 28 00:15:54.041324 containerd[1690]: time="2025-10-28T00:15:54.041312224Z" level=info msg="StartContainer for \"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\"" Oct 28 00:15:54.050228 containerd[1690]: time="2025-10-28T00:15:54.050203820Z" level=info msg="connecting to shim 785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706" address="unix:///run/containerd/s/d1272acb1da80ac89c16ef0a2b5cadeaa6febb5711398ad3ecf9ea4951fcc1af" protocol=ttrpc version=3 Oct 28 00:15:54.079680 systemd[1]: Started cri-containerd-785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706.scope - libcontainer container 785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706. Oct 28 00:15:54.112757 containerd[1690]: time="2025-10-28T00:15:54.112732662Z" level=info msg="StartContainer for \"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\" returns successfully" Oct 28 00:15:54.366422 kubelet[2983]: E1028 00:15:54.365933 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:55.546155 systemd[1]: cri-containerd-785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706.scope: Deactivated successfully. Oct 28 00:15:55.546350 systemd[1]: cri-containerd-785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706.scope: Consumed 350ms CPU time, 159.4M memory peak, 2.3M read from disk, 171.3M written to disk. Oct 28 00:15:55.560466 containerd[1690]: time="2025-10-28T00:15:55.560427082Z" level=info msg="received exit event container_id:\"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\" id:\"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\" pid:3719 exited_at:{seconds:1761610555 nanos:559759164}" Oct 28 00:15:55.561309 containerd[1690]: time="2025-10-28T00:15:55.560661053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\" id:\"785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706\" pid:3719 exited_at:{seconds:1761610555 nanos:559759164}" Oct 28 00:15:55.600484 kubelet[2983]: I1028 00:15:55.600015 2983 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Oct 28 00:15:55.617740 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-785ec6c0a8b90578f652df00f6af293124321875b3c1deff5b722a850998b706-rootfs.mount: Deactivated successfully. Oct 28 00:15:55.649763 systemd[1]: Created slice kubepods-besteffort-pod2e94a5f4_db66_4447_85c5_aa63d52ab9be.slice - libcontainer container kubepods-besteffort-pod2e94a5f4_db66_4447_85c5_aa63d52ab9be.slice. Oct 28 00:15:55.666611 systemd[1]: Created slice kubepods-burstable-pod1bfca3fa_cc6e_4c33_891f_c87e1750ccb0.slice - libcontainer container kubepods-burstable-pod1bfca3fa_cc6e_4c33_891f_c87e1750ccb0.slice. Oct 28 00:15:55.687379 systemd[1]: Created slice kubepods-burstable-pod96fb014d_9989_4ee7_ac0f_1812a6692826.slice - libcontainer container kubepods-burstable-pod96fb014d_9989_4ee7_ac0f_1812a6692826.slice. 
Oct 28 00:15:55.713538 systemd[1]: Created slice kubepods-besteffort-podf000d983_3d67_44b2_b245_9b82c5b15b84.slice - libcontainer container kubepods-besteffort-podf000d983_3d67_44b2_b245_9b82c5b15b84.slice. Oct 28 00:15:55.722454 systemd[1]: Created slice kubepods-besteffort-podb2716210_391e_4394_893c_61a7addd4a59.slice - libcontainer container kubepods-besteffort-podb2716210_391e_4394_893c_61a7addd4a59.slice. Oct 28 00:15:55.728537 systemd[1]: Created slice kubepods-besteffort-podc5b1e8a1_8c67_475a_a1aa_b28128f9865a.slice - libcontainer container kubepods-besteffort-podc5b1e8a1_8c67_475a_a1aa_b28128f9865a.slice. Oct 28 00:15:55.734038 systemd[1]: Created slice kubepods-besteffort-podd333e87f_e9e2_4229_9f47_dc31abe7437e.slice - libcontainer container kubepods-besteffort-podd333e87f_e9e2_4229_9f47_dc31abe7437e.slice. Oct 28 00:15:55.780393 kubelet[2983]: I1028 00:15:55.780362 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsdx\" (UniqueName: \"kubernetes.io/projected/d333e87f-e9e2-4229-9f47-dc31abe7437e-kube-api-access-lbsdx\") pod \"whisker-6f4499d5c-4dzhh\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " pod="calico-system/whisker-6f4499d5c-4dzhh" Oct 28 00:15:55.780393 kubelet[2983]: I1028 00:15:55.780402 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000d983-3d67-44b2-b245-9b82c5b15b84-config\") pod \"goldmane-7c778bb748-525qc\" (UID: \"f000d983-3d67-44b2-b245-9b82c5b15b84\") " pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:55.781189 kubelet[2983]: I1028 00:15:55.780413 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f000d983-3d67-44b2-b245-9b82c5b15b84-goldmane-key-pair\") pod \"goldmane-7c778bb748-525qc\" (UID: \"f000d983-3d67-44b2-b245-9b82c5b15b84\") " pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:55.781189 kubelet[2983]: I1028 00:15:55.780434 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-backend-key-pair\") pod \"whisker-6f4499d5c-4dzhh\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " pod="calico-system/whisker-6f4499d5c-4dzhh" Oct 28 00:15:55.781189 kubelet[2983]: I1028 00:15:55.780446 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e94a5f4-db66-4447-85c5-aa63d52ab9be-tigera-ca-bundle\") pod \"calico-kube-controllers-6f9599dc7b-jdzs7\" (UID: \"2e94a5f4-db66-4447-85c5-aa63d52ab9be\") " pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" Oct 28 00:15:55.781189 kubelet[2983]: I1028 00:15:55.780455 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b2716210-391e-4394-893c-61a7addd4a59-calico-apiserver-certs\") pod \"calico-apiserver-ddf7d975-c6hwj\" (UID: \"b2716210-391e-4394-893c-61a7addd4a59\") " pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" Oct 28 00:15:55.781189 kubelet[2983]: I1028 00:15:55.780465 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgwz6\" (UniqueName: 
\"kubernetes.io/projected/b2716210-391e-4394-893c-61a7addd4a59-kube-api-access-jgwz6\") pod \"calico-apiserver-ddf7d975-c6hwj\" (UID: \"b2716210-391e-4394-893c-61a7addd4a59\") " pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" Oct 28 00:15:55.781286 kubelet[2983]: I1028 00:15:55.780475 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8vl7\" (UniqueName: \"kubernetes.io/projected/1bfca3fa-cc6e-4c33-891f-c87e1750ccb0-kube-api-access-t8vl7\") pod \"coredns-66bc5c9577-c4hff\" (UID: \"1bfca3fa-cc6e-4c33-891f-c87e1750ccb0\") " pod="kube-system/coredns-66bc5c9577-c4hff" Oct 28 00:15:55.781286 kubelet[2983]: I1028 00:15:55.780485 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96fb014d-9989-4ee7-ac0f-1812a6692826-config-volume\") pod \"coredns-66bc5c9577-6ljtx\" (UID: \"96fb014d-9989-4ee7-ac0f-1812a6692826\") " pod="kube-system/coredns-66bc5c9577-6ljtx" Oct 28 00:15:55.781286 kubelet[2983]: I1028 00:15:55.780521 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vnvq\" (UniqueName: \"kubernetes.io/projected/c5b1e8a1-8c67-475a-a1aa-b28128f9865a-kube-api-access-7vnvq\") pod \"calico-apiserver-ddf7d975-rx56r\" (UID: \"c5b1e8a1-8c67-475a-a1aa-b28128f9865a\") " pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" Oct 28 00:15:55.781286 kubelet[2983]: I1028 00:15:55.780557 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndk85\" (UniqueName: \"kubernetes.io/projected/2e94a5f4-db66-4447-85c5-aa63d52ab9be-kube-api-access-ndk85\") pod \"calico-kube-controllers-6f9599dc7b-jdzs7\" (UID: \"2e94a5f4-db66-4447-85c5-aa63d52ab9be\") " pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" Oct 28 00:15:55.781286 kubelet[2983]: I1028 00:15:55.780593 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-ca-bundle\") pod \"whisker-6f4499d5c-4dzhh\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " pod="calico-system/whisker-6f4499d5c-4dzhh" Oct 28 00:15:55.781370 kubelet[2983]: I1028 00:15:55.780603 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f000d983-3d67-44b2-b245-9b82c5b15b84-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-525qc\" (UID: \"f000d983-3d67-44b2-b245-9b82c5b15b84\") " pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:55.781370 kubelet[2983]: I1028 00:15:55.780612 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvs2\" (UniqueName: \"kubernetes.io/projected/f000d983-3d67-44b2-b245-9b82c5b15b84-kube-api-access-5cvs2\") pod \"goldmane-7c778bb748-525qc\" (UID: \"f000d983-3d67-44b2-b245-9b82c5b15b84\") " pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:55.781370 kubelet[2983]: I1028 00:15:55.780621 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bfca3fa-cc6e-4c33-891f-c87e1750ccb0-config-volume\") pod \"coredns-66bc5c9577-c4hff\" (UID: \"1bfca3fa-cc6e-4c33-891f-c87e1750ccb0\") " pod="kube-system/coredns-66bc5c9577-c4hff" 
Oct 28 00:15:55.781370 kubelet[2983]: I1028 00:15:55.780642 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dqf6\" (UniqueName: \"kubernetes.io/projected/96fb014d-9989-4ee7-ac0f-1812a6692826-kube-api-access-5dqf6\") pod \"coredns-66bc5c9577-6ljtx\" (UID: \"96fb014d-9989-4ee7-ac0f-1812a6692826\") " pod="kube-system/coredns-66bc5c9577-6ljtx" Oct 28 00:15:55.781370 kubelet[2983]: I1028 00:15:55.780651 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c5b1e8a1-8c67-475a-a1aa-b28128f9865a-calico-apiserver-certs\") pod \"calico-apiserver-ddf7d975-rx56r\" (UID: \"c5b1e8a1-8c67-475a-a1aa-b28128f9865a\") " pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" Oct 28 00:15:55.957131 containerd[1690]: time="2025-10-28T00:15:55.957003751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f9599dc7b-jdzs7,Uid:2e94a5f4-db66-4447-85c5-aa63d52ab9be,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:55.978877 containerd[1690]: time="2025-10-28T00:15:55.978668398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c4hff,Uid:1bfca3fa-cc6e-4c33-891f-c87e1750ccb0,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:55.997930 containerd[1690]: time="2025-10-28T00:15:55.997910042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6ljtx,Uid:96fb014d-9989-4ee7-ac0f-1812a6692826,Namespace:kube-system,Attempt:0,}" Oct 28 00:15:56.029349 containerd[1690]: time="2025-10-28T00:15:56.029312012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-525qc,Uid:f000d983-3d67-44b2-b245-9b82c5b15b84,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:56.032752 containerd[1690]: time="2025-10-28T00:15:56.032729827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-c6hwj,Uid:b2716210-391e-4394-893c-61a7addd4a59,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:15:56.041441 containerd[1690]: time="2025-10-28T00:15:56.041272462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-rx56r,Uid:c5b1e8a1-8c67-475a-a1aa-b28128f9865a,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:15:56.041710 containerd[1690]: time="2025-10-28T00:15:56.041657538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f4499d5c-4dzhh,Uid:d333e87f-e9e2-4229-9f47-dc31abe7437e,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:56.257382 containerd[1690]: time="2025-10-28T00:15:56.257272442Z" level=error msg="Failed to destroy network for sandbox \"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.257984 containerd[1690]: time="2025-10-28T00:15:56.257964662Z" level=error msg="Failed to destroy network for sandbox \"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.258668 containerd[1690]: time="2025-10-28T00:15:56.258558005Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-ddf7d975-rx56r,Uid:c5b1e8a1-8c67-475a-a1aa-b28128f9865a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.265622 containerd[1690]: time="2025-10-28T00:15:56.265588977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-525qc,Uid:f000d983-3d67-44b2-b245-9b82c5b15b84,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.266375 kubelet[2983]: E1028 00:15:56.266346 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.266451 kubelet[2983]: E1028 00:15:56.266397 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.266451 kubelet[2983]: E1028 00:15:56.266439 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:56.266593 kubelet[2983]: E1028 00:15:56.266452 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-525qc" Oct 28 00:15:56.266593 kubelet[2983]: E1028 00:15:56.266521 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" Oct 28 00:15:56.266593 kubelet[2983]: E1028 00:15:56.266532 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" Oct 28 00:15:56.266653 kubelet[2983]: E1028 00:15:56.266550 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-ddf7d975-rx56r_calico-apiserver(c5b1e8a1-8c67-475a-a1aa-b28128f9865a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-ddf7d975-rx56r_calico-apiserver(c5b1e8a1-8c67-475a-a1aa-b28128f9865a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e962a2d71202cd6318005a1caff7c311d1957e5b31de07efd922f16cb5ac5b06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:15:56.266653 kubelet[2983]: E1028 00:15:56.266486 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-525qc_calico-system(f000d983-3d67-44b2-b245-9b82c5b15b84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-525qc_calico-system(f000d983-3d67-44b2-b245-9b82c5b15b84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e1b8bae330d6956dcc75ca5962be0f834bb5e78a770a9f65711f3f84cfc8941b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:15:56.279875 containerd[1690]: time="2025-10-28T00:15:56.279840573Z" level=error msg="Failed to destroy network for sandbox \"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.281512 containerd[1690]: time="2025-10-28T00:15:56.281471253Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6ljtx,Uid:96fb014d-9989-4ee7-ac0f-1812a6692826,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.281832 kubelet[2983]: E1028 00:15:56.281633 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.281832 kubelet[2983]: E1028 00:15:56.281665 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for 
pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6ljtx" Oct 28 00:15:56.281832 kubelet[2983]: E1028 00:15:56.281678 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6ljtx" Oct 28 00:15:56.282610 kubelet[2983]: E1028 00:15:56.281710 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-6ljtx_kube-system(96fb014d-9989-4ee7-ac0f-1812a6692826)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-6ljtx_kube-system(96fb014d-9989-4ee7-ac0f-1812a6692826)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c30ec70d997d0a0486c5a3aff89b358a6a32a1ec5da8a3e56360c71c73f76fc4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-6ljtx" podUID="96fb014d-9989-4ee7-ac0f-1812a6692826" Oct 28 00:15:56.285047 containerd[1690]: time="2025-10-28T00:15:56.284985996Z" level=error msg="Failed to destroy network for sandbox \"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.287294 containerd[1690]: time="2025-10-28T00:15:56.287271971Z" level=error msg="Failed to destroy network for sandbox \"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.287409 containerd[1690]: time="2025-10-28T00:15:56.287391640Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c4hff,Uid:1bfca3fa-cc6e-4c33-891f-c87e1750ccb0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.287644 kubelet[2983]: E1028 00:15:56.287623 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.287680 kubelet[2983]: E1028 00:15:56.287654 2983 kuberuntime_sandbox.go:71] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-c4hff" Oct 28 00:15:56.287680 kubelet[2983]: E1028 00:15:56.287666 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-c4hff" Oct 28 00:15:56.287726 kubelet[2983]: E1028 00:15:56.287697 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-c4hff_kube-system(1bfca3fa-cc6e-4c33-891f-c87e1750ccb0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-c4hff_kube-system(1bfca3fa-cc6e-4c33-891f-c87e1750ccb0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1abd606e13beb783b9d27cc8a359b4f1eab6e4708ce0228f211ee1147bdf3c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-c4hff" podUID="1bfca3fa-cc6e-4c33-891f-c87e1750ccb0" Oct 28 00:15:56.287985 containerd[1690]: time="2025-10-28T00:15:56.287961416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-c6hwj,Uid:b2716210-391e-4394-893c-61a7addd4a59,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.288569 kubelet[2983]: E1028 00:15:56.288071 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.288569 kubelet[2983]: E1028 00:15:56.288088 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" Oct 28 00:15:56.288569 kubelet[2983]: E1028 00:15:56.288099 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" Oct 28 00:15:56.288643 kubelet[2983]: E1028 00:15:56.288144 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-ddf7d975-c6hwj_calico-apiserver(b2716210-391e-4394-893c-61a7addd4a59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-ddf7d975-c6hwj_calico-apiserver(b2716210-391e-4394-893c-61a7addd4a59)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd1250a5d9ad9819fde82048bc408743c0c294843d3df1b49856eca94e4e0700\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:15:56.290778 containerd[1690]: time="2025-10-28T00:15:56.290663644Z" level=error msg="Failed to destroy network for sandbox \"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.291266 containerd[1690]: time="2025-10-28T00:15:56.291141866Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f4499d5c-4dzhh,Uid:d333e87f-e9e2-4229-9f47-dc31abe7437e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.291368 kubelet[2983]: E1028 00:15:56.291347 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.292058 kubelet[2983]: E1028 00:15:56.291478 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f4499d5c-4dzhh" Oct 28 00:15:56.292058 kubelet[2983]: E1028 00:15:56.291529 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f4499d5c-4dzhh" Oct 28 00:15:56.292058 kubelet[2983]: E1028 00:15:56.291662 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"whisker-6f4499d5c-4dzhh_calico-system(d333e87f-e9e2-4229-9f47-dc31abe7437e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f4499d5c-4dzhh_calico-system(d333e87f-e9e2-4229-9f47-dc31abe7437e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2432382558504371ac51189b215c4fb59159bcfa6f9ba1783656c7040ee257fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f4499d5c-4dzhh" podUID="d333e87f-e9e2-4229-9f47-dc31abe7437e" Oct 28 00:15:56.296055 containerd[1690]: time="2025-10-28T00:15:56.296027578Z" level=error msg="Failed to destroy network for sandbox \"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.296602 containerd[1690]: time="2025-10-28T00:15:56.296570784Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f9599dc7b-jdzs7,Uid:2e94a5f4-db66-4447-85c5-aa63d52ab9be,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.296796 kubelet[2983]: E1028 00:15:56.296771 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.296844 kubelet[2983]: E1028 00:15:56.296804 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" Oct 28 00:15:56.296844 kubelet[2983]: E1028 00:15:56.296819 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" Oct 28 00:15:56.296888 kubelet[2983]: E1028 00:15:56.296866 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6f9599dc7b-jdzs7_calico-system(2e94a5f4-db66-4447-85c5-aa63d52ab9be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6f9599dc7b-jdzs7_calico-system(2e94a5f4-db66-4447-85c5-aa63d52ab9be)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"32d9d8c0ea8eeb42858af739d66a36b6412370e1c68deccba8cafdd9cc9d9c79\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:15:56.371403 systemd[1]: Created slice kubepods-besteffort-podf0632c4d_dd7d_4536_90e2_d70b340d8f15.slice - libcontainer container kubepods-besteffort-podf0632c4d_dd7d_4536_90e2_d70b340d8f15.slice. Oct 28 00:15:56.373463 containerd[1690]: time="2025-10-28T00:15:56.373441686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g789,Uid:f0632c4d-dd7d-4536-90e2-d70b340d8f15,Namespace:calico-system,Attempt:0,}" Oct 28 00:15:56.405847 containerd[1690]: time="2025-10-28T00:15:56.405811038Z" level=error msg="Failed to destroy network for sandbox \"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.406295 containerd[1690]: time="2025-10-28T00:15:56.406247110Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g789,Uid:f0632c4d-dd7d-4536-90e2-d70b340d8f15,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.406480 kubelet[2983]: E1028 00:15:56.406455 2983 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 28 00:15:56.406531 kubelet[2983]: E1028 00:15:56.406505 2983 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:56.406531 kubelet[2983]: E1028 00:15:56.406521 2983 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7g789" Oct 28 00:15:56.406590 kubelet[2983]: E1028 00:15:56.406567 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"362442252978fb4ca9feb8aa34ba59a4cda27dc1d0ec19d61efb15fca3effec3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:15:56.531685 containerd[1690]: time="2025-10-28T00:15:56.531596066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Oct 28 00:16:03.703175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2908127652.mount: Deactivated successfully. Oct 28 00:16:03.797549 containerd[1690]: time="2025-10-28T00:16:03.788798805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:16:03.799529 containerd[1690]: time="2025-10-28T00:16:03.799488505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Oct 28 00:16:03.799814 containerd[1690]: time="2025-10-28T00:16:03.799798871Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:16:03.800129 containerd[1690]: time="2025-10-28T00:16:03.800116887Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 28 00:16:03.800705 containerd[1690]: time="2025-10-28T00:16:03.800683723Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.269054722s" Oct 28 00:16:03.800741 containerd[1690]: time="2025-10-28T00:16:03.800709127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Oct 28 00:16:03.850550 containerd[1690]: time="2025-10-28T00:16:03.850520459Z" level=info msg="CreateContainer within sandbox \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 28 00:16:03.867880 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount338100857.mount: Deactivated successfully. 
Oct 28 00:16:03.868368 containerd[1690]: time="2025-10-28T00:16:03.868347678Z" level=info msg="Container 33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:16:03.883313 containerd[1690]: time="2025-10-28T00:16:03.883243020Z" level=info msg="CreateContainer within sandbox \"c803e9db3715d563d3943f333c3351f2eb740c7630a0d4f9cd655e980746c2dd\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\"" Oct 28 00:16:03.883950 containerd[1690]: time="2025-10-28T00:16:03.883711674Z" level=info msg="StartContainer for \"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\"" Oct 28 00:16:03.888403 containerd[1690]: time="2025-10-28T00:16:03.888380216Z" level=info msg="connecting to shim 33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01" address="unix:///run/containerd/s/d1272acb1da80ac89c16ef0a2b5cadeaa6febb5711398ad3ecf9ea4951fcc1af" protocol=ttrpc version=3 Oct 28 00:16:03.972781 systemd[1]: Started cri-containerd-33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01.scope - libcontainer container 33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01. Oct 28 00:16:04.042451 containerd[1690]: time="2025-10-28T00:16:04.042428621Z" level=info msg="StartContainer for \"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\" returns successfully" Oct 28 00:16:04.360010 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 28 00:16:04.362694 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Oct 28 00:16:04.919106 containerd[1690]: time="2025-10-28T00:16:04.919078283Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\" id:\"bb6b90b44b36378f940a165d55e64db15614d432f254b1a4ba4bc7733bae80bc\" pid:4045 exit_status:1 exited_at:{seconds:1761610564 nanos:918860818}" Oct 28 00:16:05.047024 kubelet[2983]: I1028 00:16:05.046958 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mjh2g" podStartSLOduration=2.420515852 podStartE2EDuration="23.046119134s" podCreationTimestamp="2025-10-28 00:15:42 +0000 UTC" firstStartedPulling="2025-10-28 00:15:43.175721145 +0000 UTC m=+20.930273483" lastFinishedPulling="2025-10-28 00:16:03.801324426 +0000 UTC m=+41.555876765" observedRunningTime="2025-10-28 00:16:04.589363735 +0000 UTC m=+42.343916076" watchObservedRunningTime="2025-10-28 00:16:05.046119134 +0000 UTC m=+42.800671480" Oct 28 00:16:05.148652 kubelet[2983]: I1028 00:16:05.148446 2983 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-ca-bundle\") pod \"d333e87f-e9e2-4229-9f47-dc31abe7437e\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " Oct 28 00:16:05.148652 kubelet[2983]: I1028 00:16:05.148477 2983 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsdx\" (UniqueName: \"kubernetes.io/projected/d333e87f-e9e2-4229-9f47-dc31abe7437e-kube-api-access-lbsdx\") pod \"d333e87f-e9e2-4229-9f47-dc31abe7437e\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " Oct 28 00:16:05.148928 kubelet[2983]: I1028 00:16:05.148800 2983 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-backend-key-pair\") pod \"d333e87f-e9e2-4229-9f47-dc31abe7437e\" (UID: \"d333e87f-e9e2-4229-9f47-dc31abe7437e\") " Oct 28 00:16:05.161258 kubelet[2983]: I1028 00:16:05.161213 2983 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d333e87f-e9e2-4229-9f47-dc31abe7437e" (UID: "d333e87f-e9e2-4229-9f47-dc31abe7437e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Oct 28 00:16:05.172248 systemd[1]: var-lib-kubelet-pods-d333e87f\x2de9e2\x2d4229\x2d9f47\x2ddc31abe7437e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dlbsdx.mount: Deactivated successfully. Oct 28 00:16:05.173777 kubelet[2983]: I1028 00:16:05.173733 2983 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d333e87f-e9e2-4229-9f47-dc31abe7437e" (UID: "d333e87f-e9e2-4229-9f47-dc31abe7437e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Oct 28 00:16:05.175584 kubelet[2983]: I1028 00:16:05.175551 2983 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d333e87f-e9e2-4229-9f47-dc31abe7437e-kube-api-access-lbsdx" (OuterVolumeSpecName: "kube-api-access-lbsdx") pod "d333e87f-e9e2-4229-9f47-dc31abe7437e" (UID: "d333e87f-e9e2-4229-9f47-dc31abe7437e"). InnerVolumeSpecName "kube-api-access-lbsdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Oct 28 00:16:05.176290 systemd[1]: var-lib-kubelet-pods-d333e87f\x2de9e2\x2d4229\x2d9f47\x2ddc31abe7437e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Oct 28 00:16:05.249446 kubelet[2983]: I1028 00:16:05.249422 2983 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Oct 28 00:16:05.249446 kubelet[2983]: I1028 00:16:05.249442 2983 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbsdx\" (UniqueName: \"kubernetes.io/projected/d333e87f-e9e2-4229-9f47-dc31abe7437e-kube-api-access-lbsdx\") on node \"localhost\" DevicePath \"\"" Oct 28 00:16:05.249446 kubelet[2983]: I1028 00:16:05.249447 2983 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d333e87f-e9e2-4229-9f47-dc31abe7437e-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Oct 28 00:16:05.573430 systemd[1]: Removed slice kubepods-besteffort-podd333e87f_e9e2_4229_9f47_dc31abe7437e.slice - libcontainer container kubepods-besteffort-podd333e87f_e9e2_4229_9f47_dc31abe7437e.slice. 
Oct 28 00:16:05.679004 containerd[1690]: time="2025-10-28T00:16:05.678967750Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\" id:\"18759d70ead4b8d837c40d9741e960717c3be839a100f9e95a6cf44b6ca9082a\" pid:4077 exit_status:1 exited_at:{seconds:1761610565 nanos:678744985}" Oct 28 00:16:05.687173 systemd[1]: Created slice kubepods-besteffort-pod65ba14cb_b505_45aa_b041_c988e49efa4a.slice - libcontainer container kubepods-besteffort-pod65ba14cb_b505_45aa_b041_c988e49efa4a.slice. Oct 28 00:16:05.752555 kubelet[2983]: I1028 00:16:05.752433 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx82r\" (UniqueName: \"kubernetes.io/projected/65ba14cb-b505-45aa-b041-c988e49efa4a-kube-api-access-fx82r\") pod \"whisker-58bff6789c-fght4\" (UID: \"65ba14cb-b505-45aa-b041-c988e49efa4a\") " pod="calico-system/whisker-58bff6789c-fght4" Oct 28 00:16:05.752555 kubelet[2983]: I1028 00:16:05.752470 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ba14cb-b505-45aa-b041-c988e49efa4a-whisker-ca-bundle\") pod \"whisker-58bff6789c-fght4\" (UID: \"65ba14cb-b505-45aa-b041-c988e49efa4a\") " pod="calico-system/whisker-58bff6789c-fght4" Oct 28 00:16:05.752555 kubelet[2983]: I1028 00:16:05.752484 2983 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/65ba14cb-b505-45aa-b041-c988e49efa4a-whisker-backend-key-pair\") pod \"whisker-58bff6789c-fght4\" (UID: \"65ba14cb-b505-45aa-b041-c988e49efa4a\") " pod="calico-system/whisker-58bff6789c-fght4" Oct 28 00:16:06.001188 containerd[1690]: time="2025-10-28T00:16:06.001107172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58bff6789c-fght4,Uid:65ba14cb-b505-45aa-b041-c988e49efa4a,Namespace:calico-system,Attempt:0,}" Oct 28 00:16:06.388949 kubelet[2983]: I1028 00:16:06.388925 2983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d333e87f-e9e2-4229-9f47-dc31abe7437e" path="/var/lib/kubelet/pods/d333e87f-e9e2-4229-9f47-dc31abe7437e/volumes" Oct 28 00:16:06.530929 systemd-networkd[1481]: vxlan.calico: Link UP Oct 28 00:16:06.530936 systemd-networkd[1481]: vxlan.calico: Gained carrier Oct 28 00:16:06.894561 systemd-networkd[1481]: cali05814044086: Link UP Oct 28 00:16:06.895553 systemd-networkd[1481]: cali05814044086: Gained carrier Oct 28 00:16:06.912222 containerd[1690]: 2025-10-28 00:16:06.085 [INFO][4178] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 28 00:16:06.912222 containerd[1690]: 2025-10-28 00:16:06.246 [INFO][4178] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--58bff6789c--fght4-eth0 whisker-58bff6789c- calico-system 65ba14cb-b505-45aa-b041-c988e49efa4a 922 0 2025-10-28 00:16:05 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58bff6789c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-58bff6789c-fght4 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali05814044086 [] [] }} ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" 
WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-" Oct 28 00:16:06.912222 containerd[1690]: 2025-10-28 00:16:06.246 [INFO][4178] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.912222 containerd[1690]: 2025-10-28 00:16:06.793 [INFO][4205] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" HandleID="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Workload="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.820 [INFO][4205] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" HandleID="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Workload="localhost-k8s-whisker--58bff6789c--fght4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000362190), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-58bff6789c-fght4", "timestamp":"2025-10-28 00:16:06.793505885 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.820 [INFO][4205] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.826 [INFO][4205] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.837 [INFO][4205] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.854 [INFO][4205] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" host="localhost" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.871 [INFO][4205] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.877 [INFO][4205] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.879 [INFO][4205] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.881 [INFO][4205] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:06.912725 containerd[1690]: 2025-10-28 00:16:06.881 [INFO][4205] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" host="localhost" Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.881 [INFO][4205] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.884 [INFO][4205] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" host="localhost" Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.887 [INFO][4205] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" host="localhost" Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.887 [INFO][4205] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" host="localhost" Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.887 [INFO][4205] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:16:06.913882 containerd[1690]: 2025-10-28 00:16:06.887 [INFO][4205] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" HandleID="k8s-pod-network.57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Workload="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.914015 containerd[1690]: 2025-10-28 00:16:06.889 [INFO][4178] cni-plugin/k8s.go 418: Populated endpoint ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--58bff6789c--fght4-eth0", GenerateName:"whisker-58bff6789c-", Namespace:"calico-system", SelfLink:"", UID:"65ba14cb-b505-45aa-b041-c988e49efa4a", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58bff6789c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-58bff6789c-fght4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali05814044086", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:06.914015 containerd[1690]: 2025-10-28 00:16:06.889 [INFO][4178] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.914088 containerd[1690]: 2025-10-28 00:16:06.890 [INFO][4178] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali05814044086 ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.914088 containerd[1690]: 2025-10-28 00:16:06.896 [INFO][4178] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:06.914477 containerd[1690]: 2025-10-28 00:16:06.897 [INFO][4178] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--58bff6789c--fght4-eth0", GenerateName:"whisker-58bff6789c-", Namespace:"calico-system", SelfLink:"", UID:"65ba14cb-b505-45aa-b041-c988e49efa4a", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58bff6789c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f", Pod:"whisker-58bff6789c-fght4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali05814044086", MAC:"8e:fd:7e:3b:e0:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:06.915352 containerd[1690]: 2025-10-28 00:16:06.908 [INFO][4178] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" Namespace="calico-system" Pod="whisker-58bff6789c-fght4" WorkloadEndpoint="localhost-k8s-whisker--58bff6789c--fght4-eth0" Oct 28 00:16:07.068929 containerd[1690]: time="2025-10-28T00:16:07.068875492Z" level=info msg="connecting to shim 57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f" address="unix:///run/containerd/s/94bdb786897fdda70c5c2b092c723bc84ffcadaccad78872b1f585289da627da" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:07.086646 systemd[1]: Started cri-containerd-57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f.scope - libcontainer container 57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f. 
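Annotation: the "connecting to shim … address=unix:///run/containerd/s/…" record is containerd dialing the per-sandbox shim over a ttrpc unix socket before systemd starts the matching cri-containerd-….scope unit. A hedged debugging sketch that only checks whether such a socket is accepting connections; the path is copied from the log and exists only while that sandbox is alive:

package main

import (
	"fmt"
	"net"
	"time"
)

// Probe a containerd shim socket. This does not speak ttrpc; it only checks
// that something is listening, which is often enough when triaging
// "connecting to shim" failures.
func main() {
	const sock = "/run/containerd/s/94bdb786897fdda70c5c2b092c723bc84ffcadaccad78872b1f585289da627da"
	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Println("shim socket not reachable:", err)
		return
	}
	defer conn.Close()
	fmt.Println("shim socket is accepting connections")
}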
Oct 28 00:16:07.095235 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:07.135339 containerd[1690]: time="2025-10-28T00:16:07.135311870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58bff6789c-fght4,Uid:65ba14cb-b505-45aa-b041-c988e49efa4a,Namespace:calico-system,Attempt:0,} returns sandbox id \"57ecb45d8c4cde09af876356c8f54d7e8d8f7a7724f8e4d95d1b5b3984525d2f\"" Oct 28 00:16:07.136329 containerd[1690]: time="2025-10-28T00:16:07.136273459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:16:07.505955 containerd[1690]: time="2025-10-28T00:16:07.505824369Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:07.506721 containerd[1690]: time="2025-10-28T00:16:07.506704850Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:16:07.506815 containerd[1690]: time="2025-10-28T00:16:07.506787840Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:16:07.506982 kubelet[2983]: E1028 00:16:07.506955 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:07.507168 kubelet[2983]: E1028 00:16:07.506986 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:07.507168 kubelet[2983]: E1028 00:16:07.507048 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:07.508378 containerd[1690]: time="2025-10-28T00:16:07.508348351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 00:16:07.842488 containerd[1690]: time="2025-10-28T00:16:07.842382427Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:07.846008 containerd[1690]: time="2025-10-28T00:16:07.845974923Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:16:07.846123 containerd[1690]: time="2025-10-28T00:16:07.846041311Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:07.846195 kubelet[2983]: E1028 00:16:07.846154 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:16:07.846195 kubelet[2983]: E1028 00:16:07.846193 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:16:07.846317 kubelet[2983]: E1028 00:16:07.846248 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:07.846317 kubelet[2983]: E1028 00:16:07.846292 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:16:08.065611 systemd-networkd[1481]: vxlan.calico: Gained IPv6LL Oct 28 00:16:08.380639 containerd[1690]: time="2025-10-28T00:16:08.380604231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-rx56r,Uid:c5b1e8a1-8c67-475a-a1aa-b28128f9865a,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:16:08.387975 containerd[1690]: time="2025-10-28T00:16:08.387877365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f9599dc7b-jdzs7,Uid:2e94a5f4-db66-4447-85c5-aa63d52ab9be,Namespace:calico-system,Attempt:0,}" Oct 28 00:16:08.510923 systemd-networkd[1481]: cali49944e76a97: Link UP Oct 28 00:16:08.511611 systemd-networkd[1481]: cali49944e76a97: Gained carrier Oct 28 00:16:08.525053 containerd[1690]: 2025-10-28 00:16:08.430 [INFO][4366] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0 calico-kube-controllers-6f9599dc7b- calico-system 2e94a5f4-db66-4447-85c5-aa63d52ab9be 841 0 2025-10-28 00:15:43 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers 
pod-template-hash:6f9599dc7b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6f9599dc7b-jdzs7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali49944e76a97 [] [] }} ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-" Oct 28 00:16:08.525053 containerd[1690]: 2025-10-28 00:16:08.431 [INFO][4366] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.525053 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4389] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" HandleID="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Workload="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4389] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" HandleID="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Workload="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332ad0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6f9599dc7b-jdzs7", "timestamp":"2025-10-28 00:16:08.46929803 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4389] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4389] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4389] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.475 [INFO][4389] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" host="localhost" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.478 [INFO][4389] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.481 [INFO][4389] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.483 [INFO][4389] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.485 [INFO][4389] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:08.525419 containerd[1690]: 2025-10-28 00:16:08.485 [INFO][4389] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" host="localhost" Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.488 [INFO][4389] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61 Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.499 [INFO][4389] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" host="localhost" Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.503 [INFO][4389] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" host="localhost" Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.504 [INFO][4389] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" host="localhost" Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.504 [INFO][4389] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
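Annotation: note the lock timing. The second concurrent CNI ADD on this node ([4394], the calico-apiserver pod a few lines below) logs "About to acquire host-wide IPAM lock" at 08.469 but only "Acquired" it at 08.504, the instant [4389] above releases it — concurrent ADDs are strictly serialized per host. A minimal sketch of that serialization, assuming nothing about Calico beyond "one lock per host around the block read/claim/write":

package main

import (
	"fmt"
	"sync"
	"time"
)

// Two concurrent "CNI ADD" workers taking turns on one host-wide lock,
// mirroring how [4389] and [4394] above serialize on the same /26 block.
func main() {
	var ipamLock sync.Mutex
	var wg sync.WaitGroup
	for _, worker := range []string{"4389", "4394"} {
		wg.Add(1)
		go func(id string) {
			defer wg.Done()
			fmt.Println(id, "about to acquire host-wide IPAM lock")
			ipamLock.Lock()
			fmt.Println(id, "acquired host-wide IPAM lock")
			time.Sleep(30 * time.Millisecond) // stand-in for the block read/claim/write
			ipamLock.Unlock()
			fmt.Println(id, "released host-wide IPAM lock")
		}(worker)
	}
	wg.Wait()
}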
Oct 28 00:16:08.527431 containerd[1690]: 2025-10-28 00:16:08.504 [INFO][4389] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" HandleID="k8s-pod-network.31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Workload="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.527750 containerd[1690]: 2025-10-28 00:16:08.508 [INFO][4366] cni-plugin/k8s.go 418: Populated endpoint ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0", GenerateName:"calico-kube-controllers-6f9599dc7b-", Namespace:"calico-system", SelfLink:"", UID:"2e94a5f4-db66-4447-85c5-aa63d52ab9be", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f9599dc7b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6f9599dc7b-jdzs7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali49944e76a97", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:08.527807 containerd[1690]: 2025-10-28 00:16:08.508 [INFO][4366] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.527807 containerd[1690]: 2025-10-28 00:16:08.508 [INFO][4366] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali49944e76a97 ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.527807 containerd[1690]: 2025-10-28 00:16:08.510 [INFO][4366] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.527869 containerd[1690]: 2025-10-28 00:16:08.510 [INFO][4366] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0", GenerateName:"calico-kube-controllers-6f9599dc7b-", Namespace:"calico-system", SelfLink:"", UID:"2e94a5f4-db66-4447-85c5-aa63d52ab9be", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f9599dc7b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61", Pod:"calico-kube-controllers-6f9599dc7b-jdzs7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali49944e76a97", MAC:"32:d1:1a:8a:3e:1f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:08.527931 containerd[1690]: 2025-10-28 00:16:08.522 [INFO][4366] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" Namespace="calico-system" Pod="calico-kube-controllers-6f9599dc7b-jdzs7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6f9599dc7b--jdzs7-eth0" Oct 28 00:16:08.549819 containerd[1690]: time="2025-10-28T00:16:08.549743783Z" level=info msg="connecting to shim 31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61" address="unix:///run/containerd/s/2e4417431b2bd6a824362c40483617f5df6b2dfb0f2514cc9a63bc32ac25c78e" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:08.572632 systemd[1]: Started cri-containerd-31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61.scope - libcontainer container 31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61. 
Oct 28 00:16:08.577661 systemd-networkd[1481]: cali05814044086: Gained IPv6LL Oct 28 00:16:08.579305 kubelet[2983]: E1028 00:16:08.579267 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:16:08.585285 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:08.623485 containerd[1690]: time="2025-10-28T00:16:08.623443533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f9599dc7b-jdzs7,Uid:2e94a5f4-db66-4447-85c5-aa63d52ab9be,Namespace:calico-system,Attempt:0,} returns sandbox id \"31bcfee77ba9a55e0bf793e3356a6783892e8f2fe1a4e79afd96cc26e1528f61\"" Oct 28 00:16:08.624986 containerd[1690]: time="2025-10-28T00:16:08.624974674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:16:08.715462 systemd-networkd[1481]: calie36b051683f: Link UP Oct 28 00:16:08.716791 systemd-networkd[1481]: calie36b051683f: Gained carrier Oct 28 00:16:08.748642 containerd[1690]: 2025-10-28 00:16:08.433 [INFO][4365] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0 calico-apiserver-ddf7d975- calico-apiserver c5b1e8a1-8c67-475a-a1aa-b28128f9865a 851 0 2025-10-28 00:15:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:ddf7d975 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-ddf7d975-rx56r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie36b051683f [] [] }} ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-" Oct 28 00:16:08.748642 containerd[1690]: 2025-10-28 00:16:08.433 [INFO][4365] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.748642 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4394] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" 
HandleID="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Workload="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4394] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" HandleID="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Workload="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4eb0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-ddf7d975-rx56r", "timestamp":"2025-10-28 00:16:08.469616579 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.469 [INFO][4394] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.504 [INFO][4394] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.504 [INFO][4394] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.601 [INFO][4394] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" host="localhost" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.623 [INFO][4394] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.652 [INFO][4394] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.658 [INFO][4394] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.678 [INFO][4394] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:08.749394 containerd[1690]: 2025-10-28 00:16:08.678 [INFO][4394] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" host="localhost" Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.680 [INFO][4394] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2 Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.691 [INFO][4394] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" host="localhost" Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.711 [INFO][4394] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" host="localhost" Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.711 [INFO][4394] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" 
host="localhost" Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.711 [INFO][4394] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:16:08.749750 containerd[1690]: 2025-10-28 00:16:08.711 [INFO][4394] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" HandleID="k8s-pod-network.38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Workload="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.750126 containerd[1690]: 2025-10-28 00:16:08.713 [INFO][4365] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0", GenerateName:"calico-apiserver-ddf7d975-", Namespace:"calico-apiserver", SelfLink:"", UID:"c5b1e8a1-8c67-475a-a1aa-b28128f9865a", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ddf7d975", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-ddf7d975-rx56r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie36b051683f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:08.750232 containerd[1690]: 2025-10-28 00:16:08.713 [INFO][4365] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.750232 containerd[1690]: 2025-10-28 00:16:08.713 [INFO][4365] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie36b051683f ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.750232 containerd[1690]: 2025-10-28 00:16:08.715 [INFO][4365] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.750334 containerd[1690]: 2025-10-28 00:16:08.717 
[INFO][4365] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0", GenerateName:"calico-apiserver-ddf7d975-", Namespace:"calico-apiserver", SelfLink:"", UID:"c5b1e8a1-8c67-475a-a1aa-b28128f9865a", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ddf7d975", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2", Pod:"calico-apiserver-ddf7d975-rx56r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie36b051683f", MAC:"46:10:80:94:8c:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:08.750631 containerd[1690]: 2025-10-28 00:16:08.745 [INFO][4365] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-rx56r" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--rx56r-eth0" Oct 28 00:16:08.771092 containerd[1690]: time="2025-10-28T00:16:08.771053468Z" level=info msg="connecting to shim 38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2" address="unix:///run/containerd/s/dfcc740ad3d68353b5310ae8253706116c9b7c5dd3e06f602ef204b50337bc8d" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:08.792653 systemd[1]: Started cri-containerd-38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2.scope - libcontainer container 38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2. 
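Annotation: the "cali05814044086: Gained IPv6LL" event a few lines up simply means that veth acquired an IPv6 link-local address. A sketch of the classic EUI-64 derivation, fed with the MAC recorded in the whisker WorkloadEndpoint dump above (8e:fd:7e:3b:e0:16); the host-side veth has its own MAC, and kernels configured for stable-privacy addressing derive fe80:: addresses differently, so treat the output as illustrative only:

package main

import (
	"fmt"
	"net"
)

// eui64LinkLocal derives an EUI-64 IPv6 link-local address from a 48-bit MAC.
func eui64LinkLocal(mac net.HardwareAddr) net.IP {
	ip := make(net.IP, net.IPv6len) // zero-filled
	ip[0], ip[1] = 0xfe, 0x80
	ip[8] = mac[0] ^ 0x02 // flip the universal/local bit
	ip[9], ip[10] = mac[1], mac[2]
	ip[11], ip[12] = 0xff, 0xfe
	ip[13], ip[14], ip[15] = mac[3], mac[4], mac[5]
	return ip
}

func main() {
	mac, err := net.ParseMAC("8e:fd:7e:3b:e0:16") // MAC from the endpoint dump above
	if err != nil {
		panic(err)
	}
	fmt.Println(eui64LinkLocal(mac)) // fe80::8cfd:7eff:fe3b:e016
}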
Oct 28 00:16:08.806080 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:08.842157 containerd[1690]: time="2025-10-28T00:16:08.842131280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-rx56r,Uid:c5b1e8a1-8c67-475a-a1aa-b28128f9865a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"38636c3f2b51d74d6c2db2e7864727c6848d9af1c25ae6c9cb8f38dfe56fcbe2\"" Oct 28 00:16:08.978818 containerd[1690]: time="2025-10-28T00:16:08.978727625Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:08.998858 containerd[1690]: time="2025-10-28T00:16:08.998828184Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:16:08.998919 containerd[1690]: time="2025-10-28T00:16:08.998890919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:08.999060 kubelet[2983]: E1028 00:16:08.999006 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:08.999109 kubelet[2983]: E1028 00:16:08.999063 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:08.999181 kubelet[2983]: E1028 00:16:08.999157 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6f9599dc7b-jdzs7_calico-system(2e94a5f4-db66-4447-85c5-aa63d52ab9be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:08.999243 kubelet[2983]: E1028 00:16:08.999190 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:09.000115 containerd[1690]: time="2025-10-28T00:16:08.999923309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:09.342311 containerd[1690]: time="2025-10-28T00:16:09.342170804Z" level=info msg="fetch failed after status: 404 
Not Found" host=ghcr.io Oct 28 00:16:09.350185 containerd[1690]: time="2025-10-28T00:16:09.350165656Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:09.350281 containerd[1690]: time="2025-10-28T00:16:09.350204601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:09.350741 kubelet[2983]: E1028 00:16:09.350424 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:09.350741 kubelet[2983]: E1028 00:16:09.350449 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:09.350741 kubelet[2983]: E1028 00:16:09.350519 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-ddf7d975-rx56r_calico-apiserver(c5b1e8a1-8c67-475a-a1aa-b28128f9865a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:09.350741 kubelet[2983]: E1028 00:16:09.350543 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:09.377446 containerd[1690]: time="2025-10-28T00:16:09.377248499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6ljtx,Uid:96fb014d-9989-4ee7-ac0f-1812a6692826,Namespace:kube-system,Attempt:0,}" Oct 28 00:16:09.498125 systemd-networkd[1481]: calie1abe524f5c: Link UP Oct 28 00:16:09.499191 systemd-networkd[1481]: calie1abe524f5c: Gained carrier Oct 28 00:16:09.518318 containerd[1690]: 2025-10-28 00:16:09.426 [INFO][4513] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--6ljtx-eth0 coredns-66bc5c9577- kube-system 96fb014d-9989-4ee7-ac0f-1812a6692826 849 0 2025-10-28 00:15:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-6ljtx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie1abe524f5c [{dns UDP 53 0 } 
{dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-" Oct 28 00:16:09.518318 containerd[1690]: 2025-10-28 00:16:09.426 [INFO][4513] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.518318 containerd[1690]: 2025-10-28 00:16:09.459 [INFO][4524] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" HandleID="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Workload="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.459 [INFO][4524] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" HandleID="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Workload="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-6ljtx", "timestamp":"2025-10-28 00:16:09.459845373 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.460 [INFO][4524] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.460 [INFO][4524] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.460 [INFO][4524] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.466 [INFO][4524] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" host="localhost" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.470 [INFO][4524] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.474 [INFO][4524] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.476 [INFO][4524] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.478 [INFO][4524] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:09.518647 containerd[1690]: 2025-10-28 00:16:09.478 [INFO][4524] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" host="localhost" Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.479 [INFO][4524] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.484 [INFO][4524] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" host="localhost" Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.489 [INFO][4524] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" host="localhost" Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.489 [INFO][4524] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" host="localhost" Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.489 [INFO][4524] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
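Annotation: the coredns endpoint carries named ports. The plain list a few lines up prints them in decimal ({dns UDP 53 0 } …), while the Go struct dump that follows prints the same values as hex literals (Port:0x35, 0x23c1, 0x1f90, 0x1ff5). A quick conversion, purely for readability:

package main

import "fmt"

// Hex port values from the coredns WorkloadEndpoint dump, converted back to
// the decimal ports listed for the same pod.
func main() {
	for _, p := range []struct {
		name string
		port uint16
	}{
		{"dns", 0x35},             // 53/UDP
		{"dns-tcp", 0x35},         // 53/TCP
		{"metrics", 0x23c1},       // 9153/TCP
		{"liveness-probe", 0x1f90},  // 8080/TCP
		{"readiness-probe", 0x1ff5}, // 8181/TCP
	} {
		fmt.Printf("%-16s %d\n", p.name, p.port)
	}
}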
Oct 28 00:16:09.530590 containerd[1690]: 2025-10-28 00:16:09.489 [INFO][4524] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" HandleID="k8s-pod-network.edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Workload="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.491 [INFO][4513] cni-plugin/k8s.go 418: Populated endpoint ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--6ljtx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"96fb014d-9989-4ee7-ac0f-1812a6692826", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-6ljtx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1abe524f5c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.491 [INFO][4513] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.491 [INFO][4513] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie1abe524f5c ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.499 
[INFO][4513] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.501 [INFO][4513] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--6ljtx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"96fb014d-9989-4ee7-ac0f-1812a6692826", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d", Pod:"coredns-66bc5c9577-6ljtx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1abe524f5c", MAC:"26:b4:e6:50:fc:d7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:09.530702 containerd[1690]: 2025-10-28 00:16:09.516 [INFO][4513] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" Namespace="kube-system" Pod="coredns-66bc5c9577-6ljtx" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--6ljtx-eth0" Oct 28 00:16:09.582753 kubelet[2983]: E1028 00:16:09.581562 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:09.583360 kubelet[2983]: E1028 00:16:09.583311 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:09.622373 containerd[1690]: time="2025-10-28T00:16:09.622042032Z" level=info msg="connecting to shim edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d" address="unix:///run/containerd/s/2914eac1f90c06e083368551216dcc2b6c7e008bc892cadf1f7bbc091b0ea382" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:09.646615 systemd[1]: Started cri-containerd-edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d.scope - libcontainer container edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d. Oct 28 00:16:09.658243 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:09.697448 containerd[1690]: time="2025-10-28T00:16:09.697417796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6ljtx,Uid:96fb014d-9989-4ee7-ac0f-1812a6692826,Namespace:kube-system,Attempt:0,} returns sandbox id \"edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d\"" Oct 28 00:16:09.732063 containerd[1690]: time="2025-10-28T00:16:09.732037539Z" level=info msg="CreateContainer within sandbox \"edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 28 00:16:09.745472 containerd[1690]: time="2025-10-28T00:16:09.744999552Z" level=info msg="Container edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:16:09.748041 containerd[1690]: time="2025-10-28T00:16:09.748025871Z" level=info msg="CreateContainer within sandbox \"edbdbcb871ce0f9bb85fe8360e29711e8a1481ec1dbf5684e0c0bb3e46b30b3d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262\"" Oct 28 00:16:09.749616 containerd[1690]: time="2025-10-28T00:16:09.749591566Z" level=info msg="StartContainer for \"edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262\"" Oct 28 00:16:09.750162 containerd[1690]: time="2025-10-28T00:16:09.750135858Z" level=info msg="connecting to shim edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262" address="unix:///run/containerd/s/2914eac1f90c06e083368551216dcc2b6c7e008bc892cadf1f7bbc091b0ea382" protocol=ttrpc version=3 Oct 28 00:16:09.766629 systemd[1]: Started cri-containerd-edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262.scope - libcontainer container edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262. 
Oct 28 00:16:09.860612 containerd[1690]: time="2025-10-28T00:16:09.860590534Z" level=info msg="StartContainer for \"edc97f26089f84110b680add51490cbc183e4281c82d09b4c92cad6350d83262\" returns successfully" Oct 28 00:16:10.177769 systemd-networkd[1481]: calie36b051683f: Gained IPv6LL Oct 28 00:16:10.387854 containerd[1690]: time="2025-10-28T00:16:10.387827048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-525qc,Uid:f000d983-3d67-44b2-b245-9b82c5b15b84,Namespace:calico-system,Attempt:0,}" Oct 28 00:16:10.400483 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3505447046.mount: Deactivated successfully. Oct 28 00:16:10.497605 systemd-networkd[1481]: cali49944e76a97: Gained IPv6LL Oct 28 00:16:10.640463 systemd-networkd[1481]: cali3fa26eb54cd: Link UP Oct 28 00:16:10.641456 systemd-networkd[1481]: cali3fa26eb54cd: Gained carrier Oct 28 00:16:10.653174 kubelet[2983]: E1028 00:16:10.652842 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:10.653638 kubelet[2983]: E1028 00:16:10.653617 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.565 [INFO][4618] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7c778bb748--525qc-eth0 goldmane-7c778bb748- calico-system f000d983-3d67-44b2-b245-9b82c5b15b84 850 0 2025-10-28 00:15:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7c778bb748-525qc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3fa26eb54cd [] [] }} ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.565 [INFO][4618] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.587 [INFO][4631] ipam/ipam_plugin.go 227: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" HandleID="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Workload="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.587 [INFO][4631] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" HandleID="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Workload="localhost-k8s-goldmane--7c778bb748--525qc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f1d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7c778bb748-525qc", "timestamp":"2025-10-28 00:16:10.587521742 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.587 [INFO][4631] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.588 [INFO][4631] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.588 [INFO][4631] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.594 [INFO][4631] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.604 [INFO][4631] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.608 [INFO][4631] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.610 [INFO][4631] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.611 [INFO][4631] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.611 [INFO][4631] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.612 [INFO][4631] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63 Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.619 [INFO][4631] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.636 [INFO][4631] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.636 [INFO][4631] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] 
handle="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" host="localhost" Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.636 [INFO][4631] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 28 00:16:10.683552 containerd[1690]: 2025-10-28 00:16:10.636 [INFO][4631] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" HandleID="k8s-pod-network.2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Workload="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.638 [INFO][4618] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7c778bb748--525qc-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"f000d983-3d67-44b2-b245-9b82c5b15b84", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7c778bb748-525qc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3fa26eb54cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.638 [INFO][4618] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.638 [INFO][4618] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3fa26eb54cd ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.641 [INFO][4618] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.642 [INFO][4618] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7c778bb748--525qc-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"f000d983-3d67-44b2-b245-9b82c5b15b84", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63", Pod:"goldmane-7c778bb748-525qc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3fa26eb54cd", MAC:"fe:e4:57:42:1c:be", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:10.722587 containerd[1690]: 2025-10-28 00:16:10.681 [INFO][4618] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" Namespace="calico-system" Pod="goldmane-7c778bb748-525qc" WorkloadEndpoint="localhost-k8s-goldmane--7c778bb748--525qc-eth0" Oct 28 00:16:10.842094 kubelet[2983]: I1028 00:16:10.782883 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-6ljtx" podStartSLOduration=41.728254369 podStartE2EDuration="41.728254369s" podCreationTimestamp="2025-10-28 00:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:16:10.699314995 +0000 UTC m=+48.453867352" watchObservedRunningTime="2025-10-28 00:16:10.728254369 +0000 UTC m=+48.482806721" Oct 28 00:16:10.851227 containerd[1690]: time="2025-10-28T00:16:10.851134044Z" level=info msg="connecting to shim 2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63" address="unix:///run/containerd/s/c64783b0f81e294a0c589e1cfa36b1bb694ba30d5cdc1beeef9aa4f1a6e6f194" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:10.898656 systemd[1]: Started cri-containerd-2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63.scope - libcontainer container 2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63. 
Oct 28 00:16:10.918213 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:10.974941 containerd[1690]: time="2025-10-28T00:16:10.974914577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-525qc,Uid:f000d983-3d67-44b2-b245-9b82c5b15b84,Namespace:calico-system,Attempt:0,} returns sandbox id \"2504372635848a93cdc3f6e5d62220dd8134fd4490900542d6d62d011247ae63\"" Oct 28 00:16:10.987833 containerd[1690]: time="2025-10-28T00:16:10.987805196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:16:11.137619 systemd-networkd[1481]: calie1abe524f5c: Gained IPv6LL Oct 28 00:16:11.372043 containerd[1690]: time="2025-10-28T00:16:11.372008749Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:11.388822 containerd[1690]: time="2025-10-28T00:16:11.388767275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c4hff,Uid:1bfca3fa-cc6e-4c33-891f-c87e1750ccb0,Namespace:kube-system,Attempt:0,}" Oct 28 00:16:11.408408 containerd[1690]: time="2025-10-28T00:16:11.408381042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-c6hwj,Uid:b2716210-391e-4394-893c-61a7addd4a59,Namespace:calico-apiserver,Attempt:0,}" Oct 28 00:16:11.426108 containerd[1690]: time="2025-10-28T00:16:11.426067415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g789,Uid:f0632c4d-dd7d-4536-90e2-d70b340d8f15,Namespace:calico-system,Attempt:0,}" Oct 28 00:16:12.490048 containerd[1690]: time="2025-10-28T00:16:12.489997000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:16:12.494063 containerd[1690]: time="2025-10-28T00:16:12.492968347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:12.494258 kubelet[2983]: E1028 00:16:12.494226 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:12.502382 kubelet[2983]: E1028 00:16:12.501130 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:12.502382 kubelet[2983]: E1028 00:16:12.501286 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-525qc_calico-system(f000d983-3d67-44b2-b245-9b82c5b15b84): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:12.502382 kubelet[2983]: E1028 00:16:12.501314 2983 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:16:12.609772 systemd-networkd[1481]: cali3fa26eb54cd: Gained IPv6LL Oct 28 00:16:12.652287 systemd-networkd[1481]: cali18c56577c03: Link UP Oct 28 00:16:12.654136 systemd-networkd[1481]: cali18c56577c03: Gained carrier Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.570 [INFO][4703] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--c4hff-eth0 coredns-66bc5c9577- kube-system 1bfca3fa-cc6e-4c33-891f-c87e1750ccb0 848 0 2025-10-28 00:15:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-c4hff eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali18c56577c03 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.570 [INFO][4703] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.604 [INFO][4746] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" HandleID="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Workload="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.604 [INFO][4746] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" HandleID="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Workload="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cddc0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-c4hff", "timestamp":"2025-10-28 00:16:12.604023944 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.604 [INFO][4746] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.604 [INFO][4746] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.604 [INFO][4746] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.614 [INFO][4746] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.617 [INFO][4746] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.621 [INFO][4746] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.623 [INFO][4746] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.625 [INFO][4746] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.625 [INFO][4746] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.626 [INFO][4746] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54 Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.630 [INFO][4746] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.635 [INFO][4746] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.636 [INFO][4746] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" host="localhost" Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.636 [INFO][4746] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:16:12.670069 containerd[1690]: 2025-10-28 00:16:12.636 [INFO][4746] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" HandleID="k8s-pod-network.1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Workload="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.644 [INFO][4703] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--c4hff-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"1bfca3fa-cc6e-4c33-891f-c87e1750ccb0", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-c4hff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali18c56577c03", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.644 [INFO][4703] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.644 [INFO][4703] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali18c56577c03 ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.655 
[INFO][4703] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.655 [INFO][4703] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--c4hff-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"1bfca3fa-cc6e-4c33-891f-c87e1750ccb0", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54", Pod:"coredns-66bc5c9577-c4hff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali18c56577c03", MAC:"46:e2:df:73:81:88", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:12.689562 containerd[1690]: 2025-10-28 00:16:12.664 [INFO][4703] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" Namespace="kube-system" Pod="coredns-66bc5c9577-c4hff" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--c4hff-eth0" Oct 28 00:16:12.689787 kubelet[2983]: E1028 00:16:12.679976 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:16:12.822972 containerd[1690]: time="2025-10-28T00:16:12.822126618Z" level=info msg="connecting to shim 1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54" address="unix:///run/containerd/s/a0a489c11646abf5adc7c2aacda56d24fd8336d35e8661c70a3058c1ba1eb6fd" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:12.843609 systemd[1]: Started cri-containerd-1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54.scope - libcontainer container 1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54. Oct 28 00:16:12.853167 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:12.899696 containerd[1690]: time="2025-10-28T00:16:12.899669399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-c4hff,Uid:1bfca3fa-cc6e-4c33-891f-c87e1750ccb0,Namespace:kube-system,Attempt:0,} returns sandbox id \"1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54\"" Oct 28 00:16:12.917634 containerd[1690]: time="2025-10-28T00:16:12.917615099Z" level=info msg="CreateContainer within sandbox \"1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 28 00:16:12.949966 systemd-networkd[1481]: cali9108d4740c7: Link UP Oct 28 00:16:12.950713 systemd-networkd[1481]: cali9108d4740c7: Gained carrier Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.563 [INFO][4702] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7g789-eth0 csi-node-driver- calico-system f0632c4d-dd7d-4536-90e2-d70b340d8f15 731 0 2025-10-28 00:15:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7g789 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9108d4740c7 [] [] }} ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.563 [INFO][4702] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.605 [INFO][4744] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" HandleID="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Workload="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.605 [INFO][4744] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" 
HandleID="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Workload="localhost-k8s-csi--node--driver--7g789-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7g789", "timestamp":"2025-10-28 00:16:12.605293306 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.605 [INFO][4744] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.636 [INFO][4744] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.636 [INFO][4744] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.715 [INFO][4744] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.751 [INFO][4744] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.798 [INFO][4744] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.814 [INFO][4744] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.862 [INFO][4744] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.862 [INFO][4744] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.863 [INFO][4744] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9 Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.874 [INFO][4744] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.925 [INFO][4744] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.926 [INFO][4744] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" host="localhost" Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.926 [INFO][4744] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:16:12.969759 containerd[1690]: 2025-10-28 00:16:12.926 [INFO][4744] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" HandleID="k8s-pod-network.8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Workload="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.937 [INFO][4702] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7g789-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f0632c4d-dd7d-4536-90e2-d70b340d8f15", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7g789", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9108d4740c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.948 [INFO][4702] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.948 [INFO][4702] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9108d4740c7 ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.951 [INFO][4702] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.951 [INFO][4702] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7g789-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f0632c4d-dd7d-4536-90e2-d70b340d8f15", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9", Pod:"csi-node-driver-7g789", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9108d4740c7", MAC:"92:36:98:2b:64:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:12.983693 containerd[1690]: 2025-10-28 00:16:12.967 [INFO][4702] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" Namespace="calico-system" Pod="csi-node-driver-7g789" WorkloadEndpoint="localhost-k8s-csi--node--driver--7g789-eth0" Oct 28 00:16:12.985509 containerd[1690]: time="2025-10-28T00:16:12.984568608Z" level=info msg="Container 43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c: CDI devices from CRI Config.CDIDevices: []" Oct 28 00:16:12.996036 containerd[1690]: time="2025-10-28T00:16:12.995985447Z" level=info msg="CreateContainer within sandbox \"1ce0c22d338d65488defe4d0929294457bd0d2e0a9e49dec32378c8118d90d54\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c\"" Oct 28 00:16:12.997761 containerd[1690]: time="2025-10-28T00:16:12.997705250Z" level=info msg="StartContainer for \"43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c\"" Oct 28 00:16:12.998768 containerd[1690]: time="2025-10-28T00:16:12.998748251Z" level=info msg="connecting to shim 43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c" address="unix:///run/containerd/s/a0a489c11646abf5adc7c2aacda56d24fd8336d35e8661c70a3058c1ba1eb6fd" protocol=ttrpc version=3 Oct 28 00:16:13.031582 containerd[1690]: time="2025-10-28T00:16:13.031535084Z" level=info msg="connecting to shim 8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9" address="unix:///run/containerd/s/eaef0a15a61cef0ab63c2ac4e4cc039f18f23ea56903e3786cf11bc8e0fc547e" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:13.046987 systemd[1]: Started cri-containerd-43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c.scope - libcontainer container 43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c. 
Oct 28 00:16:13.069733 systemd-networkd[1481]: cali19b5b166854: Link UP Oct 28 00:16:13.074451 systemd-networkd[1481]: cali19b5b166854: Gained carrier Oct 28 00:16:13.077749 systemd[1]: Started cri-containerd-8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9.scope - libcontainer container 8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9. Oct 28 00:16:13.098486 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.568 [INFO][4724] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0 calico-apiserver-ddf7d975- calico-apiserver b2716210-391e-4394-893c-61a7addd4a59 845 0 2025-10-28 00:15:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:ddf7d975 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-ddf7d975-c6hwj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali19b5b166854 [] [] }} ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.568 [INFO][4724] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.609 [INFO][4742] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" HandleID="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Workload="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.610 [INFO][4742] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" HandleID="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Workload="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-ddf7d975-c6hwj", "timestamp":"2025-10-28 00:16:12.609475453 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.610 [INFO][4742] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.926 [INFO][4742] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.926 [INFO][4742] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.964 [INFO][4742] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.975 [INFO][4742] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.985 [INFO][4742] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:12.989 [INFO][4742] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.004 [INFO][4742] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.005 [INFO][4742] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.013 [INFO][4742] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239 Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.041 [INFO][4742] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.055 [INFO][4742] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.055 [INFO][4742] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" host="localhost" Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.055 [INFO][4742] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 28 00:16:13.118308 containerd[1690]: 2025-10-28 00:16:13.055 [INFO][4742] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" HandleID="k8s-pod-network.799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Workload="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.061 [INFO][4724] cni-plugin/k8s.go 418: Populated endpoint ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0", GenerateName:"calico-apiserver-ddf7d975-", Namespace:"calico-apiserver", SelfLink:"", UID:"b2716210-391e-4394-893c-61a7addd4a59", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ddf7d975", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-ddf7d975-c6hwj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19b5b166854", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.062 [INFO][4724] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.062 [INFO][4724] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali19b5b166854 ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.072 [INFO][4724] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.074 [INFO][4724] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0", GenerateName:"calico-apiserver-ddf7d975-", Namespace:"calico-apiserver", SelfLink:"", UID:"b2716210-391e-4394-893c-61a7addd4a59", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.October, 28, 0, 15, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ddf7d975", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239", Pod:"calico-apiserver-ddf7d975-c6hwj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19b5b166854", MAC:"52:28:af:88:78:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 28 00:16:13.118780 containerd[1690]: 2025-10-28 00:16:13.112 [INFO][4724] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" Namespace="calico-apiserver" Pod="calico-apiserver-ddf7d975-c6hwj" WorkloadEndpoint="localhost-k8s-calico--apiserver--ddf7d975--c6hwj-eth0" Oct 28 00:16:13.131606 containerd[1690]: time="2025-10-28T00:16:13.131564428Z" level=info msg="StartContainer for \"43ff61f5f977109142546600b4e30e534e086c0e3e963187ee98b92ee25a8d7c\" returns successfully" Oct 28 00:16:13.155696 containerd[1690]: time="2025-10-28T00:16:13.155665359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7g789,Uid:f0632c4d-dd7d-4536-90e2-d70b340d8f15,Namespace:calico-system,Attempt:0,} returns sandbox id \"8e457788f1d2bddc8e3cb8dd7bc49906b03a1224fa34f367ddef241e64091fb9\"" Oct 28 00:16:13.158114 containerd[1690]: time="2025-10-28T00:16:13.157953416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:16:13.221170 containerd[1690]: time="2025-10-28T00:16:13.221121871Z" level=info msg="connecting to shim 799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239" address="unix:///run/containerd/s/af57ea2895f9a670a7333978e4721808f93509b5860ca5bebda72c9aa138bff9" namespace=k8s.io protocol=ttrpc version=3 Oct 28 00:16:13.250813 systemd[1]: Started cri-containerd-799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239.scope - libcontainer container 799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239. 
Oct 28 00:16:13.261628 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 28 00:16:13.313252 containerd[1690]: time="2025-10-28T00:16:13.313220802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ddf7d975-c6hwj,Uid:b2716210-391e-4394-893c-61a7addd4a59,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"799f4fe2c9a9967dbe2235e8b4517cc4ad4a677209c8ef6362ff84848dd7e239\"" Oct 28 00:16:13.587998 containerd[1690]: time="2025-10-28T00:16:13.587901241Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:13.588866 containerd[1690]: time="2025-10-28T00:16:13.588825666Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 28 00:16:13.589460 containerd[1690]: time="2025-10-28T00:16:13.589068247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:16:13.589529 kubelet[2983]: E1028 00:16:13.589282 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:13.589529 kubelet[2983]: E1028 00:16:13.589315 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:13.590524 kubelet[2983]: E1028 00:16:13.589941 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:13.590734 containerd[1690]: time="2025-10-28T00:16:13.590702964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:13.739398 kubelet[2983]: I1028 00:16:13.739359 2983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-c4hff" podStartSLOduration=44.739346518 podStartE2EDuration="44.739346518s" podCreationTimestamp="2025-10-28 00:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-28 00:16:13.716444001 +0000 UTC m=+51.470996346" watchObservedRunningTime="2025-10-28 00:16:13.739346518 +0000 UTC m=+51.493898867" Oct 28 00:16:13.936751 containerd[1690]: time="2025-10-28T00:16:13.936659220Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:13.943093 containerd[1690]: time="2025-10-28T00:16:13.943063203Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:13.943285 containerd[1690]: time="2025-10-28T00:16:13.943121807Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:13.943319 kubelet[2983]: E1028 00:16:13.943224 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:13.943319 kubelet[2983]: E1028 00:16:13.943252 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:13.943582 kubelet[2983]: E1028 00:16:13.943529 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-ddf7d975-c6hwj_calico-apiserver(b2716210-391e-4394-893c-61a7addd4a59): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:13.943582 kubelet[2983]: E1028 00:16:13.943558 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:16:13.943840 containerd[1690]: time="2025-10-28T00:16:13.943806135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:16:14.209794 systemd-networkd[1481]: cali9108d4740c7: Gained IPv6LL Oct 28 00:16:14.283664 containerd[1690]: time="2025-10-28T00:16:14.283623365Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:14.294883 containerd[1690]: time="2025-10-28T00:16:14.294858565Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:16:14.294925 containerd[1690]: time="2025-10-28T00:16:14.294909470Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:16:14.295005 kubelet[2983]: E1028 00:16:14.294982 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:14.295060 kubelet[2983]: E1028 00:16:14.295009 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:14.295060 kubelet[2983]: E1028 00:16:14.295054 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:14.295223 kubelet[2983]: E1028 00:16:14.295091 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:16:14.593974 systemd-networkd[1481]: cali18c56577c03: Gained IPv6LL Oct 28 00:16:14.658761 kubelet[2983]: E1028 00:16:14.658640 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:16:14.659162 kubelet[2983]: E1028 00:16:14.659141 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:16:14.721866 systemd-networkd[1481]: cali19b5b166854: Gained IPv6LL Oct 28 00:16:22.655072 containerd[1690]: time="2025-10-28T00:16:22.654556865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:16:23.009365 containerd[1690]: time="2025-10-28T00:16:23.009146417Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:23.013675 containerd[1690]: time="2025-10-28T00:16:23.013624431Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:16:23.013734 containerd[1690]: time="2025-10-28T00:16:23.013704702Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:23.013903 kubelet[2983]: E1028 00:16:23.013830 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:23.013903 kubelet[2983]: E1028 00:16:23.013892 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:23.014178 kubelet[2983]: E1028 00:16:23.014072 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6f9599dc7b-jdzs7_calico-system(2e94a5f4-db66-4447-85c5-aa63d52ab9be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:23.014178 kubelet[2983]: E1028 00:16:23.014097 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:23.018187 containerd[1690]: time="2025-10-28T00:16:23.014572265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:16:23.422916 containerd[1690]: 
time="2025-10-28T00:16:23.422802714Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:23.428892 containerd[1690]: time="2025-10-28T00:16:23.428827518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:16:23.428892 containerd[1690]: time="2025-10-28T00:16:23.428879540Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:16:23.429003 kubelet[2983]: E1028 00:16:23.428959 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:23.429003 kubelet[2983]: E1028 00:16:23.428991 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:23.429066 kubelet[2983]: E1028 00:16:23.429038 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:23.429688 containerd[1690]: time="2025-10-28T00:16:23.429667401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 00:16:23.782840 containerd[1690]: time="2025-10-28T00:16:23.782749706Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:23.790025 containerd[1690]: time="2025-10-28T00:16:23.789991172Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:16:23.790117 containerd[1690]: time="2025-10-28T00:16:23.790057748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:23.790222 kubelet[2983]: E1028 00:16:23.790192 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:16:23.790274 kubelet[2983]: E1028 00:16:23.790227 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:16:23.790342 kubelet[2983]: E1028 00:16:23.790321 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:23.790532 kubelet[2983]: E1028 00:16:23.790364 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:16:25.366135 containerd[1690]: time="2025-10-28T00:16:25.366099686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:25.730985 containerd[1690]: time="2025-10-28T00:16:25.730902681Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:25.746775 containerd[1690]: time="2025-10-28T00:16:25.746731187Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:25.746897 containerd[1690]: time="2025-10-28T00:16:25.746744005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:25.747120 kubelet[2983]: E1028 00:16:25.747019 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:25.747120 kubelet[2983]: E1028 00:16:25.747064 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:25.747315 kubelet[2983]: E1028 00:16:25.747193 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-ddf7d975-rx56r_calico-apiserver(c5b1e8a1-8c67-475a-a1aa-b28128f9865a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:25.747315 kubelet[2983]: E1028 00:16:25.747218 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:25.747642 containerd[1690]: time="2025-10-28T00:16:25.747594163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:16:26.094802 containerd[1690]: time="2025-10-28T00:16:26.094755083Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:26.101393 containerd[1690]: time="2025-10-28T00:16:26.101203890Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:16:26.101393 containerd[1690]: time="2025-10-28T00:16:26.101282750Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:26.101550 kubelet[2983]: E1028 00:16:26.101414 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:26.101550 kubelet[2983]: E1028 00:16:26.101442 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:26.101550 kubelet[2983]: E1028 00:16:26.101508 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-525qc_calico-system(f000d983-3d67-44b2-b245-9b82c5b15b84): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:26.101550 kubelet[2983]: E1028 00:16:26.101532 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:16:26.367265 containerd[1690]: time="2025-10-28T00:16:26.367134326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:26.686760 containerd[1690]: time="2025-10-28T00:16:26.686679474Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:26.701909 containerd[1690]: time="2025-10-28T00:16:26.701879013Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:26.702144 containerd[1690]: time="2025-10-28T00:16:26.701921558Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:26.702192 kubelet[2983]: E1028 00:16:26.702000 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:26.702192 kubelet[2983]: E1028 00:16:26.702022 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:26.702192 kubelet[2983]: E1028 00:16:26.702089 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-ddf7d975-c6hwj_calico-apiserver(b2716210-391e-4394-893c-61a7addd4a59): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:26.702192 kubelet[2983]: E1028 00:16:26.702110 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:16:29.367367 containerd[1690]: time="2025-10-28T00:16:29.367182178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:16:29.744693 containerd[1690]: time="2025-10-28T00:16:29.744606731Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:29.745215 containerd[1690]: time="2025-10-28T00:16:29.745150614Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
Oct 28 00:16:29.745673 containerd[1690]: time="2025-10-28T00:16:29.745349964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:16:29.745728 kubelet[2983]: E1028 00:16:29.745375 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:29.745728 kubelet[2983]: E1028 00:16:29.745416 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:29.745728 kubelet[2983]: E1028 00:16:29.745486 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:29.746181 containerd[1690]: time="2025-10-28T00:16:29.746168142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:16:30.106608 containerd[1690]: time="2025-10-28T00:16:30.106581678Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:30.108455 containerd[1690]: time="2025-10-28T00:16:30.108432553Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:16:30.108561 containerd[1690]: time="2025-10-28T00:16:30.108514824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:16:30.108697 kubelet[2983]: E1028 00:16:30.108672 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:30.108735 kubelet[2983]: E1028 00:16:30.108705 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:30.108770 kubelet[2983]: E1028 00:16:30.108754 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod 
csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:30.108825 kubelet[2983]: E1028 00:16:30.108782 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:16:34.366075 kubelet[2983]: E1028 00:16:34.365971 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:35.366663 kubelet[2983]: E1028 00:16:35.366610 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:16:35.661330 containerd[1690]: time="2025-10-28T00:16:35.661241204Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\" id:\"07b244874c31d5cd879d861bf4e2717c7b97f31d6046f5f4d922eded1492cd1c\" pid:5007 exited_at:{seconds:1761610595 nanos:648326799}" Oct 28 00:16:38.367516 kubelet[2983]: E1028 00:16:38.367093 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:38.944266 systemd[1]: Started sshd@7-139.178.70.103:22-139.178.68.195:37142.service - OpenSSH per-connection server daemon (139.178.68.195:37142). Oct 28 00:16:39.049220 sshd[5024]: Accepted publickey for core from 139.178.68.195 port 37142 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:39.051591 sshd-session[5024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:39.060527 systemd-logind[1652]: New session 10 of user core. Oct 28 00:16:39.066590 systemd[1]: Started session-10.scope - Session 10 of User core. Oct 28 00:16:39.367200 kubelet[2983]: E1028 00:16:39.367164 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:16:39.660719 sshd[5027]: Connection closed by 139.178.68.195 port 37142 Oct 28 00:16:39.661654 sshd-session[5024]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:39.669721 systemd[1]: sshd@7-139.178.70.103:22-139.178.68.195:37142.service: Deactivated successfully. Oct 28 00:16:39.671368 systemd[1]: session-10.scope: Deactivated successfully. Oct 28 00:16:39.673323 systemd-logind[1652]: Session 10 logged out. Waiting for processes to exit. Oct 28 00:16:39.674284 systemd-logind[1652]: Removed session 10. 
Oct 28 00:16:41.377467 kubelet[2983]: E1028 00:16:41.377319 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:16:41.384325 kubelet[2983]: E1028 00:16:41.377880 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:16:44.672315 systemd[1]: Started sshd@8-139.178.70.103:22-139.178.68.195:54106.service - OpenSSH per-connection server daemon (139.178.68.195:54106). Oct 28 00:16:44.759080 sshd[5044]: Accepted publickey for core from 139.178.68.195 port 54106 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:44.759954 sshd-session[5044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:44.765632 systemd-logind[1652]: New session 11 of user core. Oct 28 00:16:44.770633 systemd[1]: Started session-11.scope - Session 11 of User core. Oct 28 00:16:44.922852 sshd[5049]: Connection closed by 139.178.68.195 port 54106 Oct 28 00:16:44.923042 sshd-session[5044]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:44.929797 systemd[1]: sshd@8-139.178.70.103:22-139.178.68.195:54106.service: Deactivated successfully. Oct 28 00:16:44.930995 systemd[1]: session-11.scope: Deactivated successfully. Oct 28 00:16:44.933047 systemd-logind[1652]: Session 11 logged out. Waiting for processes to exit. Oct 28 00:16:44.933863 systemd-logind[1652]: Removed session 11. Oct 28 00:16:45.980711 systemd[1]: Started sshd@9-139.178.70.103:22-80.94.95.115:41536.service - OpenSSH per-connection server daemon (80.94.95.115:41536). 
Oct 28 00:16:46.367133 containerd[1690]: time="2025-10-28T00:16:46.367074492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 28 00:16:46.724467 containerd[1690]: time="2025-10-28T00:16:46.724356138Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:46.724890 containerd[1690]: time="2025-10-28T00:16:46.724683198Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 28 00:16:46.724890 containerd[1690]: time="2025-10-28T00:16:46.724738853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:46.725000 kubelet[2983]: E1028 00:16:46.724906 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:46.725000 kubelet[2983]: E1028 00:16:46.724954 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 28 00:16:46.725310 kubelet[2983]: E1028 00:16:46.725011 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6f9599dc7b-jdzs7_calico-system(2e94a5f4-db66-4447-85c5-aa63d52ab9be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:46.725310 kubelet[2983]: E1028 00:16:46.725033 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:16:48.130908 sshd[5062]: Connection closed by authenticating user root 80.94.95.115 port 41536 [preauth] Oct 28 00:16:48.132327 systemd[1]: sshd@9-139.178.70.103:22-80.94.95.115:41536.service: Deactivated successfully. 
Oct 28 00:16:48.367160 containerd[1690]: time="2025-10-28T00:16:48.366751009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 28 00:16:48.729046 containerd[1690]: time="2025-10-28T00:16:48.729007100Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:48.734722 containerd[1690]: time="2025-10-28T00:16:48.734668338Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 28 00:16:48.734839 containerd[1690]: time="2025-10-28T00:16:48.734777597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 28 00:16:48.734936 kubelet[2983]: E1028 00:16:48.734896 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:48.734936 kubelet[2983]: E1028 00:16:48.734937 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 28 00:16:48.735171 kubelet[2983]: E1028 00:16:48.735003 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:48.737219 containerd[1690]: time="2025-10-28T00:16:48.737037450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 28 00:16:49.178728 containerd[1690]: time="2025-10-28T00:16:49.178684620Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:49.182909 containerd[1690]: time="2025-10-28T00:16:49.182873207Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 28 00:16:49.191185 containerd[1690]: time="2025-10-28T00:16:49.182938712Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 28 00:16:49.191233 kubelet[2983]: E1028 00:16:49.183142 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 
00:16:49.191233 kubelet[2983]: E1028 00:16:49.183173 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 28 00:16:49.191233 kubelet[2983]: E1028 00:16:49.183249 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-58bff6789c-fght4_calico-system(65ba14cb-b505-45aa-b041-c988e49efa4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:49.191336 kubelet[2983]: E1028 00:16:49.183291 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:16:49.932518 systemd[1]: Started sshd@10-139.178.70.103:22-139.178.68.195:54112.service - OpenSSH per-connection server daemon (139.178.68.195:54112). Oct 28 00:16:50.038467 sshd[5076]: Accepted publickey for core from 139.178.68.195 port 54112 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:50.039397 sshd-session[5076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:50.043323 systemd-logind[1652]: New session 12 of user core. Oct 28 00:16:50.050607 systemd[1]: Started session-12.scope - Session 12 of User core. Oct 28 00:16:50.253539 sshd[5079]: Connection closed by 139.178.68.195 port 54112 Oct 28 00:16:50.256273 systemd[1]: sshd@10-139.178.70.103:22-139.178.68.195:54112.service: Deactivated successfully. Oct 28 00:16:50.253998 sshd-session[5076]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:50.257627 systemd[1]: session-12.scope: Deactivated successfully. Oct 28 00:16:50.258177 systemd-logind[1652]: Session 12 logged out. Waiting for processes to exit. Oct 28 00:16:50.258941 systemd-logind[1652]: Removed session 12. 
Oct 28 00:16:51.366584 containerd[1690]: time="2025-10-28T00:16:51.366457413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:51.724004 containerd[1690]: time="2025-10-28T00:16:51.723668351Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:51.727235 containerd[1690]: time="2025-10-28T00:16:51.727213824Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:51.732536 containerd[1690]: time="2025-10-28T00:16:51.727281737Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:51.732536 containerd[1690]: time="2025-10-28T00:16:51.727716798Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 28 00:16:51.732618 kubelet[2983]: E1028 00:16:51.727447 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:51.732618 kubelet[2983]: E1028 00:16:51.727475 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:51.732618 kubelet[2983]: E1028 00:16:51.727630 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-ddf7d975-c6hwj_calico-apiserver(b2716210-391e-4394-893c-61a7addd4a59): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:51.732618 kubelet[2983]: E1028 00:16:51.727662 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:16:52.055078 containerd[1690]: time="2025-10-28T00:16:52.054942776Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:52.063891 containerd[1690]: time="2025-10-28T00:16:52.063852397Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 28 00:16:52.064079 containerd[1690]: 
time="2025-10-28T00:16:52.063905694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:52.065149 kubelet[2983]: E1028 00:16:52.065109 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:52.065235 kubelet[2983]: E1028 00:16:52.065155 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 28 00:16:52.065235 kubelet[2983]: E1028 00:16:52.065208 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-ddf7d975-rx56r_calico-apiserver(c5b1e8a1-8c67-475a-a1aa-b28128f9865a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:52.065235 kubelet[2983]: E1028 00:16:52.065229 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:16:53.369508 containerd[1690]: time="2025-10-28T00:16:53.367627713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 28 00:16:53.698701 containerd[1690]: time="2025-10-28T00:16:53.698551278Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:53.704200 containerd[1690]: time="2025-10-28T00:16:53.704136987Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 28 00:16:53.704263 containerd[1690]: time="2025-10-28T00:16:53.704198340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 28 00:16:53.704286 kubelet[2983]: E1028 00:16:53.704270 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:53.704527 kubelet[2983]: E1028 00:16:53.704293 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 28 00:16:53.704527 kubelet[2983]: E1028 00:16:53.704339 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:53.716813 containerd[1690]: time="2025-10-28T00:16:53.705094345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 28 00:16:54.049205 containerd[1690]: time="2025-10-28T00:16:54.049176823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:54.055066 containerd[1690]: time="2025-10-28T00:16:54.052414436Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 28 00:16:54.055066 containerd[1690]: time="2025-10-28T00:16:54.052463562Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 28 00:16:54.055122 kubelet[2983]: E1028 00:16:54.052563 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:54.055122 kubelet[2983]: E1028 00:16:54.052606 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 28 00:16:54.055122 kubelet[2983]: E1028 00:16:54.052655 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-7g789_calico-system(f0632c4d-dd7d-4536-90e2-d70b340d8f15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:54.055207 kubelet[2983]: E1028 00:16:54.052691 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:16:55.268705 systemd[1]: Started sshd@11-139.178.70.103:22-139.178.68.195:45706.service - OpenSSH per-connection server daemon (139.178.68.195:45706). Oct 28 00:16:55.309823 sshd[5094]: Accepted publickey for core from 139.178.68.195 port 45706 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:55.310620 sshd-session[5094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:55.314227 systemd-logind[1652]: New session 13 of user core. Oct 28 00:16:55.320612 systemd[1]: Started session-13.scope - Session 13 of User core. Oct 28 00:16:55.417525 sshd[5097]: Connection closed by 139.178.68.195 port 45706 Oct 28 00:16:55.417719 sshd-session[5094]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:55.426802 systemd[1]: sshd@11-139.178.70.103:22-139.178.68.195:45706.service: Deactivated successfully. Oct 28 00:16:55.428597 systemd[1]: session-13.scope: Deactivated successfully. Oct 28 00:16:55.429236 systemd-logind[1652]: Session 13 logged out. Waiting for processes to exit. Oct 28 00:16:55.430276 systemd-logind[1652]: Removed session 13. Oct 28 00:16:55.433041 systemd[1]: Started sshd@12-139.178.70.103:22-139.178.68.195:45716.service - OpenSSH per-connection server daemon (139.178.68.195:45716). Oct 28 00:16:55.473934 sshd[5109]: Accepted publickey for core from 139.178.68.195 port 45716 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:55.474822 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:55.478357 systemd-logind[1652]: New session 14 of user core. Oct 28 00:16:55.483668 systemd[1]: Started session-14.scope - Session 14 of User core. Oct 28 00:16:55.760112 sshd[5112]: Connection closed by 139.178.68.195 port 45716 Oct 28 00:16:55.773698 systemd[1]: Started sshd@13-139.178.70.103:22-139.178.68.195:45720.service - OpenSSH per-connection server daemon (139.178.68.195:45720). Oct 28 00:16:55.778411 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:55.813283 systemd[1]: sshd@12-139.178.70.103:22-139.178.68.195:45716.service: Deactivated successfully. Oct 28 00:16:55.816387 systemd[1]: session-14.scope: Deactivated successfully. Oct 28 00:16:55.817539 systemd-logind[1652]: Session 14 logged out. Waiting for processes to exit. Oct 28 00:16:55.818334 systemd-logind[1652]: Removed session 14. Oct 28 00:16:56.103409 sshd[5119]: Accepted publickey for core from 139.178.68.195 port 45720 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:16:56.112474 sshd-session[5119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:16:56.116265 systemd-logind[1652]: New session 15 of user core. Oct 28 00:16:56.122868 systemd[1]: Started session-15.scope - Session 15 of User core. Oct 28 00:16:56.349114 sshd[5125]: Connection closed by 139.178.68.195 port 45720 Oct 28 00:16:56.349631 sshd-session[5119]: pam_unix(sshd:session): session closed for user core Oct 28 00:16:56.353038 systemd[1]: sshd@13-139.178.70.103:22-139.178.68.195:45720.service: Deactivated successfully. 
Oct 28 00:16:56.355081 systemd[1]: session-15.scope: Deactivated successfully. Oct 28 00:16:56.356406 systemd-logind[1652]: Session 15 logged out. Waiting for processes to exit. Oct 28 00:16:56.357445 systemd-logind[1652]: Removed session 15. Oct 28 00:16:56.373683 containerd[1690]: time="2025-10-28T00:16:56.373428370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 28 00:16:56.714413 containerd[1690]: time="2025-10-28T00:16:56.714317719Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 28 00:16:56.725505 containerd[1690]: time="2025-10-28T00:16:56.725452649Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 28 00:16:56.725611 containerd[1690]: time="2025-10-28T00:16:56.725586718Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 28 00:16:56.725776 kubelet[2983]: E1028 00:16:56.725739 2983 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:56.726138 kubelet[2983]: E1028 00:16:56.725779 2983 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 28 00:16:56.726138 kubelet[2983]: E1028 00:16:56.725844 2983 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-525qc_calico-system(f000d983-3d67-44b2-b245-9b82c5b15b84): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 28 00:16:56.726138 kubelet[2983]: E1028 00:16:56.725880 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:16:58.367767 kubelet[2983]: E1028 00:16:58.367139 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:17:01.361382 systemd[1]: Started sshd@14-139.178.70.103:22-139.178.68.195:45728.service - OpenSSH per-connection server daemon (139.178.68.195:45728). Oct 28 00:17:01.455909 sshd[5144]: Accepted publickey for core from 139.178.68.195 port 45728 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:01.456860 sshd-session[5144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:01.459472 systemd-logind[1652]: New session 16 of user core. Oct 28 00:17:01.471630 systemd[1]: Started session-16.scope - Session 16 of User core. Oct 28 00:17:01.584000 sshd[5147]: Connection closed by 139.178.68.195 port 45728 Oct 28 00:17:01.584476 sshd-session[5144]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:01.587350 systemd[1]: sshd@14-139.178.70.103:22-139.178.68.195:45728.service: Deactivated successfully. Oct 28 00:17:01.588317 systemd[1]: session-16.scope: Deactivated successfully. Oct 28 00:17:01.589258 systemd-logind[1652]: Session 16 logged out. Waiting for processes to exit. Oct 28 00:17:01.589878 systemd-logind[1652]: Removed session 16. Oct 28 00:17:03.366888 kubelet[2983]: E1028 00:17:03.366853 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:17:04.368980 kubelet[2983]: E1028 00:17:04.368922 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:17:05.736590 containerd[1690]: time="2025-10-28T00:17:05.736560077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33375ec4c4954d4eb452c0162cba220f78ebec8b01570879e91a8765ea9dba01\" id:\"7c0388bc378ac00af6fdc80438dbeb208d2c633bc48d3d472970795da390dea7\" pid:5172 exited_at:{seconds:1761610625 nanos:736189898}" Oct 28 00:17:06.366995 kubelet[2983]: E1028 00:17:06.366942 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:17:06.591576 systemd[1]: Started sshd@15-139.178.70.103:22-139.178.68.195:42430.service - OpenSSH per-connection server daemon (139.178.68.195:42430). Oct 28 00:17:06.643056 sshd[5184]: Accepted publickey for core from 139.178.68.195 port 42430 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:06.643604 sshd-session[5184]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:06.647057 systemd-logind[1652]: New session 17 of user core. Oct 28 00:17:06.651612 systemd[1]: Started session-17.scope - Session 17 of User core. Oct 28 00:17:06.767321 sshd[5187]: Connection closed by 139.178.68.195 port 42430 Oct 28 00:17:06.766831 sshd-session[5184]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:06.769053 systemd[1]: sshd@15-139.178.70.103:22-139.178.68.195:42430.service: Deactivated successfully. Oct 28 00:17:06.770427 systemd[1]: session-17.scope: Deactivated successfully. Oct 28 00:17:06.771431 systemd-logind[1652]: Session 17 logged out. Waiting for processes to exit. Oct 28 00:17:06.772537 systemd-logind[1652]: Removed session 17. Oct 28 00:17:07.366308 kubelet[2983]: E1028 00:17:07.366171 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:17:07.366767 kubelet[2983]: E1028 00:17:07.366750 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:17:09.365617 kubelet[2983]: E1028 00:17:09.365591 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:17:11.777620 systemd[1]: Started sshd@16-139.178.70.103:22-139.178.68.195:42432.service - OpenSSH per-connection server daemon (139.178.68.195:42432). Oct 28 00:17:11.815693 sshd[5201]: Accepted publickey for core from 139.178.68.195 port 42432 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:11.817020 sshd-session[5201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:11.821090 systemd-logind[1652]: New session 18 of user core. Oct 28 00:17:11.826577 systemd[1]: Started session-18.scope - Session 18 of User core. Oct 28 00:17:11.982505 sshd[5204]: Connection closed by 139.178.68.195 port 42432 Oct 28 00:17:11.982663 sshd-session[5201]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:11.986431 systemd[1]: sshd@16-139.178.70.103:22-139.178.68.195:42432.service: Deactivated successfully. Oct 28 00:17:11.989258 systemd[1]: session-18.scope: Deactivated successfully. Oct 28 00:17:11.990677 systemd-logind[1652]: Session 18 logged out. Waiting for processes to exit. Oct 28 00:17:11.992698 systemd-logind[1652]: Removed session 18. Oct 28 00:17:14.367180 kubelet[2983]: E1028 00:17:14.366756 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:17:16.372516 kubelet[2983]: E1028 00:17:16.370046 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:17:16.994665 systemd[1]: Started sshd@17-139.178.70.103:22-139.178.68.195:37720.service - OpenSSH per-connection server daemon (139.178.68.195:37720). 
Oct 28 00:17:17.035900 sshd[5216]: Accepted publickey for core from 139.178.68.195 port 37720 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:17.037113 sshd-session[5216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:17.039963 systemd-logind[1652]: New session 19 of user core. Oct 28 00:17:17.046640 systemd[1]: Started session-19.scope - Session 19 of User core. Oct 28 00:17:17.165584 sshd[5219]: Connection closed by 139.178.68.195 port 37720 Oct 28 00:17:17.166013 sshd-session[5216]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:17.175269 systemd[1]: sshd@17-139.178.70.103:22-139.178.68.195:37720.service: Deactivated successfully. Oct 28 00:17:17.177896 systemd[1]: session-19.scope: Deactivated successfully. Oct 28 00:17:17.179504 systemd-logind[1652]: Session 19 logged out. Waiting for processes to exit. Oct 28 00:17:17.184783 systemd[1]: Started sshd@18-139.178.70.103:22-139.178.68.195:37730.service - OpenSSH per-connection server daemon (139.178.68.195:37730). Oct 28 00:17:17.187910 systemd-logind[1652]: Removed session 19. Oct 28 00:17:17.228394 sshd[5231]: Accepted publickey for core from 139.178.68.195 port 37730 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:17.229738 sshd-session[5231]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:17.234694 systemd-logind[1652]: New session 20 of user core. Oct 28 00:17:17.239658 systemd[1]: Started session-20.scope - Session 20 of User core. Oct 28 00:17:17.780077 sshd[5234]: Connection closed by 139.178.68.195 port 37730 Oct 28 00:17:17.783845 sshd-session[5231]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:17.788640 systemd[1]: Started sshd@19-139.178.70.103:22-139.178.68.195:37732.service - OpenSSH per-connection server daemon (139.178.68.195:37732). Oct 28 00:17:17.790574 systemd[1]: sshd@18-139.178.70.103:22-139.178.68.195:37730.service: Deactivated successfully. Oct 28 00:17:17.791890 systemd[1]: session-20.scope: Deactivated successfully. Oct 28 00:17:17.798589 systemd-logind[1652]: Session 20 logged out. Waiting for processes to exit. Oct 28 00:17:17.799672 systemd-logind[1652]: Removed session 20. Oct 28 00:17:17.902114 sshd[5241]: Accepted publickey for core from 139.178.68.195 port 37732 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:17.902975 sshd-session[5241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:17.906203 systemd-logind[1652]: New session 21 of user core. Oct 28 00:17:17.912725 systemd[1]: Started session-21.scope - Session 21 of User core. 
Oct 28 00:17:18.367860 kubelet[2983]: E1028 00:17:18.367834 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-rx56r" podUID="c5b1e8a1-8c67-475a-a1aa-b28128f9865a" Oct 28 00:17:18.798477 sshd[5247]: Connection closed by 139.178.68.195 port 37732 Oct 28 00:17:18.808145 systemd[1]: Started sshd@20-139.178.70.103:22-139.178.68.195:37746.service - OpenSSH per-connection server daemon (139.178.68.195:37746). Oct 28 00:17:18.817932 sshd-session[5241]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:18.851978 systemd[1]: sshd@19-139.178.70.103:22-139.178.68.195:37732.service: Deactivated successfully. Oct 28 00:17:18.853320 systemd[1]: session-21.scope: Deactivated successfully. Oct 28 00:17:18.854278 systemd-logind[1652]: Session 21 logged out. Waiting for processes to exit. Oct 28 00:17:18.855135 systemd-logind[1652]: Removed session 21. Oct 28 00:17:18.915444 sshd[5258]: Accepted publickey for core from 139.178.68.195 port 37746 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:18.916799 sshd-session[5258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:18.919948 systemd-logind[1652]: New session 22 of user core. Oct 28 00:17:18.925634 systemd[1]: Started session-22.scope - Session 22 of User core. Oct 28 00:17:19.193800 sshd[5265]: Connection closed by 139.178.68.195 port 37746 Oct 28 00:17:19.194680 sshd-session[5258]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:19.203429 systemd[1]: sshd@20-139.178.70.103:22-139.178.68.195:37746.service: Deactivated successfully. Oct 28 00:17:19.205573 systemd[1]: session-22.scope: Deactivated successfully. Oct 28 00:17:19.207826 systemd-logind[1652]: Session 22 logged out. Waiting for processes to exit. Oct 28 00:17:19.210924 systemd[1]: Started sshd@21-139.178.70.103:22-139.178.68.195:37748.service - OpenSSH per-connection server daemon (139.178.68.195:37748). Oct 28 00:17:19.212153 systemd-logind[1652]: Removed session 22. Oct 28 00:17:19.264476 sshd[5275]: Accepted publickey for core from 139.178.68.195 port 37748 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:19.265228 sshd-session[5275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:19.268418 systemd-logind[1652]: New session 23 of user core. Oct 28 00:17:19.273577 systemd[1]: Started session-23.scope - Session 23 of User core. 
Oct 28 00:17:19.368627 kubelet[2983]: E1028 00:17:19.367427 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15" Oct 28 00:17:19.426274 sshd[5278]: Connection closed by 139.178.68.195 port 37748 Oct 28 00:17:19.426716 sshd-session[5275]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:19.429391 systemd[1]: sshd@21-139.178.70.103:22-139.178.68.195:37748.service: Deactivated successfully. Oct 28 00:17:19.431480 systemd[1]: session-23.scope: Deactivated successfully. Oct 28 00:17:19.433280 systemd-logind[1652]: Session 23 logged out. Waiting for processes to exit. Oct 28 00:17:19.435024 systemd-logind[1652]: Removed session 23. Oct 28 00:17:22.389554 kubelet[2983]: E1028 00:17:22.389340 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-525qc" podUID="f000d983-3d67-44b2-b245-9b82c5b15b84" Oct 28 00:17:24.367374 kubelet[2983]: E1028 00:17:24.367249 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6f9599dc7b-jdzs7" podUID="2e94a5f4-db66-4447-85c5-aa63d52ab9be" Oct 28 00:17:24.437631 systemd[1]: Started sshd@22-139.178.70.103:22-139.178.68.195:56932.service - OpenSSH per-connection server daemon (139.178.68.195:56932). Oct 28 00:17:24.477446 sshd[5296]: Accepted publickey for core from 139.178.68.195 port 56932 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:24.478219 sshd-session[5296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:24.481821 systemd-logind[1652]: New session 24 of user core. Oct 28 00:17:24.487850 systemd[1]: Started session-24.scope - Session 24 of User core. 
Oct 28 00:17:24.611556 sshd[5299]: Connection closed by 139.178.68.195 port 56932 Oct 28 00:17:24.612016 sshd-session[5296]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:24.617065 systemd[1]: sshd@22-139.178.70.103:22-139.178.68.195:56932.service: Deactivated successfully. Oct 28 00:17:24.619452 systemd[1]: session-24.scope: Deactivated successfully. Oct 28 00:17:24.621571 systemd-logind[1652]: Session 24 logged out. Waiting for processes to exit. Oct 28 00:17:24.623871 systemd-logind[1652]: Removed session 24. Oct 28 00:17:27.365799 kubelet[2983]: E1028 00:17:27.365688 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-58bff6789c-fght4" podUID="65ba14cb-b505-45aa-b041-c988e49efa4a" Oct 28 00:17:28.366648 kubelet[2983]: E1028 00:17:28.366410 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-ddf7d975-c6hwj" podUID="b2716210-391e-4394-893c-61a7addd4a59" Oct 28 00:17:29.620875 systemd[1]: Started sshd@23-139.178.70.103:22-139.178.68.195:56946.service - OpenSSH per-connection server daemon (139.178.68.195:56946). Oct 28 00:17:29.661884 sshd[5319]: Accepted publickey for core from 139.178.68.195 port 56946 ssh2: RSA SHA256:gvN99Fsf8MBCtYJz2AFeCMy8DpEvVsdcpqs+QYVMf5E Oct 28 00:17:29.663676 sshd-session[5319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 28 00:17:29.668487 systemd-logind[1652]: New session 25 of user core. Oct 28 00:17:29.672614 systemd[1]: Started session-25.scope - Session 25 of User core. Oct 28 00:17:30.060515 sshd[5322]: Connection closed by 139.178.68.195 port 56946 Oct 28 00:17:30.060831 sshd-session[5319]: pam_unix(sshd:session): session closed for user core Oct 28 00:17:30.062775 systemd[1]: sshd@23-139.178.70.103:22-139.178.68.195:56946.service: Deactivated successfully. Oct 28 00:17:30.064139 systemd[1]: session-25.scope: Deactivated successfully. Oct 28 00:17:30.065049 systemd-logind[1652]: Session 25 logged out. Waiting for processes to exit. Oct 28 00:17:30.065757 systemd-logind[1652]: Removed session 25. 
Oct 28 00:17:30.366228 kubelet[2983]: E1028 00:17:30.366131 2983 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7g789" podUID="f0632c4d-dd7d-4536-90e2-d70b340d8f15"