Nov 24 06:46:32.705368 kernel: Linux version 6.12.58-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sun Nov 23 20:49:05 -00 2025
Nov 24 06:46:32.705384 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14
Nov 24 06:46:32.705391 kernel: Disabled fast string operations
Nov 24 06:46:32.705395 kernel: BIOS-provided physical RAM map:
Nov 24 06:46:32.705399 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Nov 24 06:46:32.705403 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Nov 24 06:46:32.705408 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Nov 24 06:46:32.705413 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Nov 24 06:46:32.705417 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Nov 24 06:46:32.705422 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Nov 24 06:46:32.705426 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Nov 24 06:46:32.705430 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Nov 24 06:46:32.705434 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Nov 24 06:46:32.705439 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Nov 24 06:46:32.705445 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Nov 24 06:46:32.705450 kernel: NX (Execute Disable) protection: active
Nov 24 06:46:32.705455 kernel: APIC: Static calls initialized
Nov 24 06:46:32.705460 kernel: SMBIOS 2.7 present.
Nov 24 06:46:32.705465 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Nov 24 06:46:32.705469 kernel: DMI: Memory slots populated: 1/128
Nov 24 06:46:32.705474 kernel: vmware: hypercall mode: 0x00
Nov 24 06:46:32.705479 kernel: Hypervisor detected: VMware
Nov 24 06:46:32.705484 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Nov 24 06:46:32.705490 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Nov 24 06:46:32.705494 kernel: vmware: using clock offset of 5706741308 ns
Nov 24 06:46:32.705499 kernel: tsc: Detected 3408.000 MHz processor
Nov 24 06:46:32.705504 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 24 06:46:32.705510 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 24 06:46:32.705514 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Nov 24 06:46:32.705519 kernel: total RAM covered: 3072M
Nov 24 06:46:32.705524 kernel: Found optimal setting for mtrr clean up
Nov 24 06:46:32.705530 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Nov 24 06:46:32.705535 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Nov 24 06:46:32.705541 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Nov 24 06:46:32.705545 kernel: Using GB pages for direct mapping
Nov 24 06:46:32.705550 kernel: ACPI: Early table checksum verification disabled
Nov 24 06:46:32.705555 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Nov 24 06:46:32.705560 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Nov 24 06:46:32.705565 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Nov 24 06:46:32.705570 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Nov 24 06:46:32.705577 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Nov 24 06:46:32.705583 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Nov 24 06:46:32.705588 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Nov 24 06:46:32.705593 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ?
APIC 06040000 LTP 00000000) Nov 24 06:46:32.705598 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Nov 24 06:46:32.705603 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Nov 24 06:46:32.705609 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Nov 24 06:46:32.705615 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Nov 24 06:46:32.705620 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Nov 24 06:46:32.705625 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Nov 24 06:46:32.705630 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Nov 24 06:46:32.705635 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Nov 24 06:46:32.705647 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Nov 24 06:46:32.705652 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Nov 24 06:46:32.705658 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Nov 24 06:46:32.705663 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Nov 24 06:46:32.705669 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Nov 24 06:46:32.705674 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Nov 24 06:46:32.705679 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Nov 24 06:46:32.705684 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Nov 24 06:46:32.705689 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Nov 24 06:46:32.705695 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Nov 24 06:46:32.705700 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Nov 24 06:46:32.705705 kernel: Zone ranges: Nov 24 06:46:32.705710 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Nov 24 06:46:32.705715 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Nov 24 06:46:32.705727 kernel: Normal empty Nov 24 06:46:32.705732 kernel: Device empty Nov 24 06:46:32.705737 kernel: Movable zone start for each node Nov 24 06:46:32.705742 kernel: Early memory node ranges Nov 24 06:46:32.705747 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Nov 24 06:46:32.705752 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Nov 24 06:46:32.705758 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Nov 24 06:46:32.705763 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Nov 24 06:46:32.705768 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Nov 24 06:46:32.705781 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Nov 24 06:46:32.705788 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Nov 24 06:46:32.705793 kernel: ACPI: PM-Timer IO Port: 0x1008 Nov 24 06:46:32.705798 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Nov 24 06:46:32.705803 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Nov 24 06:46:32.705808 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Nov 24 06:46:32.705816 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Nov 24 06:46:32.705822 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Nov 24 06:46:32.705827 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Nov 24 06:46:32.705832 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge 
lint[0x1]) Nov 24 06:46:32.705839 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Nov 24 06:46:32.705844 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Nov 24 06:46:32.705853 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Nov 24 06:46:32.705858 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Nov 24 06:46:32.705863 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Nov 24 06:46:32.705868 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Nov 24 06:46:32.705873 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Nov 24 06:46:32.705878 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Nov 24 06:46:32.705886 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Nov 24 06:46:32.705893 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Nov 24 06:46:32.705898 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Nov 24 06:46:32.705903 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Nov 24 06:46:32.705908 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Nov 24 06:46:32.705915 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Nov 24 06:46:32.705921 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Nov 24 06:46:32.705926 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Nov 24 06:46:32.705931 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Nov 24 06:46:32.705936 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Nov 24 06:46:32.705941 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Nov 24 06:46:32.705950 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Nov 24 06:46:32.705956 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Nov 24 06:46:32.705961 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Nov 24 06:46:32.705966 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Nov 24 06:46:32.705971 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Nov 24 06:46:32.705976 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Nov 24 06:46:32.705984 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Nov 24 06:46:32.705989 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Nov 24 06:46:32.705994 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Nov 24 06:46:32.705999 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Nov 24 06:46:32.706005 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Nov 24 06:46:32.706011 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Nov 24 06:46:32.706016 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Nov 24 06:46:32.706022 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Nov 24 06:46:32.706031 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Nov 24 06:46:32.706036 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Nov 24 06:46:32.706042 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Nov 24 06:46:32.706047 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Nov 24 06:46:32.706053 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Nov 24 06:46:32.706059 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Nov 24 06:46:32.706064 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Nov 24 06:46:32.706069 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Nov 24 06:46:32.706075 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Nov 24 06:46:32.706080 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x31] high edge lint[0x1]) Nov 24 06:46:32.706085 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Nov 24 06:46:32.706090 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Nov 24 06:46:32.706095 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Nov 24 06:46:32.706101 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Nov 24 06:46:32.706107 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Nov 24 06:46:32.706112 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Nov 24 06:46:32.706118 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Nov 24 06:46:32.706123 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Nov 24 06:46:32.706128 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Nov 24 06:46:32.706134 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Nov 24 06:46:32.706139 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Nov 24 06:46:32.706144 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Nov 24 06:46:32.706150 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Nov 24 06:46:32.706156 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Nov 24 06:46:32.706162 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Nov 24 06:46:32.706167 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Nov 24 06:46:32.706172 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Nov 24 06:46:32.706177 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Nov 24 06:46:32.706182 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Nov 24 06:46:32.706188 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Nov 24 06:46:32.706193 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Nov 24 06:46:32.706198 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Nov 24 06:46:32.706204 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Nov 24 06:46:32.706210 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Nov 24 06:46:32.706215 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Nov 24 06:46:32.706220 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Nov 24 06:46:32.706226 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Nov 24 06:46:32.706231 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Nov 24 06:46:32.706236 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Nov 24 06:46:32.706242 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Nov 24 06:46:32.706247 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Nov 24 06:46:32.706252 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Nov 24 06:46:32.706258 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Nov 24 06:46:32.706265 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Nov 24 06:46:32.706270 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Nov 24 06:46:32.706275 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Nov 24 06:46:32.706281 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Nov 24 06:46:32.706286 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Nov 24 06:46:32.706291 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Nov 24 06:46:32.706297 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Nov 24 06:46:32.706302 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Nov 24 06:46:32.706307 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Nov 24 06:46:32.706313 kernel: 
ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Nov 24 06:46:32.706319 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Nov 24 06:46:32.706324 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Nov 24 06:46:32.706330 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Nov 24 06:46:32.706335 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Nov 24 06:46:32.706340 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Nov 24 06:46:32.706345 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Nov 24 06:46:32.706351 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Nov 24 06:46:32.706356 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Nov 24 06:46:32.706362 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Nov 24 06:46:32.706368 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Nov 24 06:46:32.706373 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Nov 24 06:46:32.706379 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Nov 24 06:46:32.706384 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Nov 24 06:46:32.706389 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Nov 24 06:46:32.706395 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Nov 24 06:46:32.706400 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Nov 24 06:46:32.706409 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Nov 24 06:46:32.706414 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Nov 24 06:46:32.706419 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Nov 24 06:46:32.706426 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Nov 24 06:46:32.706432 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Nov 24 06:46:32.706437 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Nov 24 06:46:32.706442 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Nov 24 06:46:32.706448 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Nov 24 06:46:32.706453 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Nov 24 06:46:32.706458 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Nov 24 06:46:32.706464 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Nov 24 06:46:32.706469 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Nov 24 06:46:32.706474 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Nov 24 06:46:32.706481 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Nov 24 06:46:32.706486 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Nov 24 06:46:32.706491 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Nov 24 06:46:32.706496 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Nov 24 06:46:32.706502 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Nov 24 06:46:32.706507 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Nov 24 06:46:32.706512 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Nov 24 06:46:32.706518 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Nov 24 06:46:32.706523 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Nov 24 06:46:32.706529 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Nov 24 06:46:32.706535 kernel: TSC deadline timer available Nov 24 06:46:32.706541 kernel: CPU topo: Max. logical packages: 128 Nov 24 06:46:32.706546 kernel: CPU topo: Max. logical dies: 128 Nov 24 06:46:32.706560 kernel: CPU topo: Max. 
dies per package: 1 Nov 24 06:46:32.706567 kernel: CPU topo: Max. threads per core: 1 Nov 24 06:46:32.706572 kernel: CPU topo: Num. cores per package: 1 Nov 24 06:46:32.706577 kernel: CPU topo: Num. threads per package: 1 Nov 24 06:46:32.706583 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Nov 24 06:46:32.706592 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Nov 24 06:46:32.706599 kernel: Booting paravirtualized kernel on VMware hypervisor Nov 24 06:46:32.706605 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Nov 24 06:46:32.706611 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Nov 24 06:46:32.706616 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Nov 24 06:46:32.706622 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Nov 24 06:46:32.706627 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Nov 24 06:46:32.706632 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Nov 24 06:46:32.707019 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Nov 24 06:46:32.707028 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Nov 24 06:46:32.707035 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Nov 24 06:46:32.707041 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Nov 24 06:46:32.707046 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Nov 24 06:46:32.707051 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Nov 24 06:46:32.707057 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Nov 24 06:46:32.707062 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Nov 24 06:46:32.707067 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Nov 24 06:46:32.707073 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Nov 24 06:46:32.707078 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Nov 24 06:46:32.707085 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Nov 24 06:46:32.707090 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Nov 24 06:46:32.707096 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Nov 24 06:46:32.707102 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14 Nov 24 06:46:32.707108 kernel: random: crng init done Nov 24 06:46:32.707113 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Nov 24 06:46:32.707119 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Nov 24 06:46:32.707125 kernel: printk: log_buf_len min size: 262144 bytes Nov 24 06:46:32.707131 kernel: printk: log_buf_len: 1048576 bytes Nov 24 06:46:32.707137 kernel: printk: early log buf free: 245704(93%) Nov 24 06:46:32.707142 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Nov 24 06:46:32.707148 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Nov 24 06:46:32.707153 kernel: Fallback order for Node 0: 0 Nov 24 06:46:32.707159 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Nov 24 06:46:32.707164 kernel: Policy zone: DMA32 Nov 24 06:46:32.707170 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Nov 24 06:46:32.707176 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Nov 24 06:46:32.707182 kernel: ftrace: allocating 40103 entries in 157 pages Nov 24 06:46:32.707188 kernel: ftrace: allocated 157 pages with 5 groups Nov 24 06:46:32.707193 kernel: Dynamic Preempt: voluntary Nov 24 06:46:32.707199 kernel: rcu: Preemptible hierarchical RCU implementation. Nov 24 06:46:32.707204 kernel: rcu: RCU event tracing is enabled. Nov 24 06:46:32.707210 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Nov 24 06:46:32.707215 kernel: Trampoline variant of Tasks RCU enabled. Nov 24 06:46:32.707221 kernel: Rude variant of Tasks RCU enabled. Nov 24 06:46:32.707227 kernel: Tracing variant of Tasks RCU enabled. Nov 24 06:46:32.707233 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Nov 24 06:46:32.707239 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Nov 24 06:46:32.707244 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Nov 24 06:46:32.707250 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Nov 24 06:46:32.707255 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Nov 24 06:46:32.707261 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Nov 24 06:46:32.707266 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Nov 24 06:46:32.707272 kernel: Console: colour VGA+ 80x25 Nov 24 06:46:32.707277 kernel: printk: legacy console [tty0] enabled Nov 24 06:46:32.707284 kernel: printk: legacy console [ttyS0] enabled Nov 24 06:46:32.707290 kernel: ACPI: Core revision 20240827 Nov 24 06:46:32.707295 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Nov 24 06:46:32.707301 kernel: APIC: Switch to symmetric I/O mode setup Nov 24 06:46:32.707306 kernel: x2apic enabled Nov 24 06:46:32.707312 kernel: APIC: Switched APIC routing to: physical x2apic Nov 24 06:46:32.707317 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Nov 24 06:46:32.707323 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Nov 24 06:46:32.707328 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Nov 24 06:46:32.707335 kernel: Disabled fast string operations Nov 24 06:46:32.707340 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Nov 24 06:46:32.707346 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Nov 24 06:46:32.707351 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Nov 24 06:46:32.707356 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Nov 24 06:46:32.707362 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Nov 24 06:46:32.707368 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Nov 24 06:46:32.707373 kernel: RETBleed: Mitigation: Enhanced IBRS Nov 24 06:46:32.707379 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Nov 24 06:46:32.707385 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Nov 24 06:46:32.707391 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Nov 24 06:46:32.707396 kernel: SRBDS: Unknown: Dependent on hypervisor status Nov 24 06:46:32.707401 kernel: GDS: Unknown: Dependent on hypervisor status Nov 24 06:46:32.707407 kernel: active return thunk: its_return_thunk Nov 24 06:46:32.707412 kernel: ITS: Mitigation: Aligned branch/return thunks Nov 24 06:46:32.707418 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Nov 24 06:46:32.707423 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Nov 24 06:46:32.707429 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Nov 24 06:46:32.707435 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Nov 24 06:46:32.707441 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Nov 24 06:46:32.707450 kernel: Freeing SMP alternatives memory: 32K Nov 24 06:46:32.707467 kernel: pid_max: default: 131072 minimum: 1024 Nov 24 06:46:32.707474 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Nov 24 06:46:32.707479 kernel: landlock: Up and running. Nov 24 06:46:32.707485 kernel: SELinux: Initializing. Nov 24 06:46:32.707491 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Nov 24 06:46:32.707500 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Nov 24 06:46:32.707515 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Nov 24 06:46:32.707525 kernel: Performance Events: Skylake events, core PMU driver. Nov 24 06:46:32.707532 kernel: core: CPUID marked event: 'cpu cycles' unavailable Nov 24 06:46:32.707538 kernel: core: CPUID marked event: 'instructions' unavailable Nov 24 06:46:32.707543 kernel: core: CPUID marked event: 'bus cycles' unavailable Nov 24 06:46:32.707549 kernel: core: CPUID marked event: 'cache references' unavailable Nov 24 06:46:32.707554 kernel: core: CPUID marked event: 'cache misses' unavailable Nov 24 06:46:32.707559 kernel: core: CPUID marked event: 'branch instructions' unavailable Nov 24 06:46:32.707565 kernel: core: CPUID marked event: 'branch misses' unavailable Nov 24 06:46:32.707574 kernel: ... version: 1 Nov 24 06:46:32.707581 kernel: ... bit width: 48 Nov 24 06:46:32.707586 kernel: ... generic registers: 4 Nov 24 06:46:32.707592 kernel: ... value mask: 0000ffffffffffff Nov 24 06:46:32.707597 kernel: ... max period: 000000007fffffff Nov 24 06:46:32.707603 kernel: ... 
fixed-purpose events: 0 Nov 24 06:46:32.707608 kernel: ... event mask: 000000000000000f Nov 24 06:46:32.707614 kernel: signal: max sigframe size: 1776 Nov 24 06:46:32.707619 kernel: rcu: Hierarchical SRCU implementation. Nov 24 06:46:32.707626 kernel: rcu: Max phase no-delay instances is 400. Nov 24 06:46:32.707631 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Nov 24 06:46:32.707643 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Nov 24 06:46:32.707649 kernel: smp: Bringing up secondary CPUs ... Nov 24 06:46:32.707655 kernel: smpboot: x86: Booting SMP configuration: Nov 24 06:46:32.707660 kernel: .... node #0, CPUs: #1 Nov 24 06:46:32.707665 kernel: Disabled fast string operations Nov 24 06:46:32.707671 kernel: smp: Brought up 1 node, 2 CPUs Nov 24 06:46:32.707676 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Nov 24 06:46:32.707683 kernel: Memory: 1916040K/2096628K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46200K init, 2560K bss, 169204K reserved, 0K cma-reserved) Nov 24 06:46:32.707689 kernel: devtmpfs: initialized Nov 24 06:46:32.707694 kernel: x86/mm: Memory block size: 128MB Nov 24 06:46:32.707700 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Nov 24 06:46:32.707705 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Nov 24 06:46:32.707711 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Nov 24 06:46:32.707716 kernel: pinctrl core: initialized pinctrl subsystem Nov 24 06:46:32.707722 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Nov 24 06:46:32.707728 kernel: audit: initializing netlink subsys (disabled) Nov 24 06:46:32.707734 kernel: audit: type=2000 audit(1763966789.287:1): state=initialized audit_enabled=0 res=1 Nov 24 06:46:32.707740 kernel: thermal_sys: Registered thermal governor 'step_wise' Nov 24 06:46:32.707745 kernel: thermal_sys: Registered thermal governor 'user_space' Nov 24 06:46:32.707751 kernel: cpuidle: using governor menu Nov 24 06:46:32.707756 kernel: Simple Boot Flag at 0x36 set to 0x80 Nov 24 06:46:32.707761 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Nov 24 06:46:32.707767 kernel: dca service started, version 1.12.1 Nov 24 06:46:32.707772 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Nov 24 06:46:32.707785 kernel: PCI: Using configuration type 1 for base access Nov 24 06:46:32.707793 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Nov 24 06:46:32.707799 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Nov 24 06:46:32.707805 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Nov 24 06:46:32.707810 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Nov 24 06:46:32.707816 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Nov 24 06:46:32.707822 kernel: ACPI: Added _OSI(Module Device) Nov 24 06:46:32.707827 kernel: ACPI: Added _OSI(Processor Device) Nov 24 06:46:32.707833 kernel: ACPI: Added _OSI(Processor Aggregator Device) Nov 24 06:46:32.707839 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Nov 24 06:46:32.707846 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Nov 24 06:46:32.707851 kernel: ACPI: Interpreter enabled Nov 24 06:46:32.707857 kernel: ACPI: PM: (supports S0 S1 S5) Nov 24 06:46:32.707863 kernel: ACPI: Using IOAPIC for interrupt routing Nov 24 06:46:32.707868 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Nov 24 06:46:32.707874 kernel: PCI: Using E820 reservations for host bridge windows Nov 24 06:46:32.707880 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Nov 24 06:46:32.707886 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Nov 24 06:46:32.707971 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Nov 24 06:46:32.708027 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Nov 24 06:46:32.708077 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Nov 24 06:46:32.708085 kernel: PCI host bridge to bus 0000:00 Nov 24 06:46:32.708135 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Nov 24 06:46:32.708181 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Nov 24 06:46:32.708224 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Nov 24 06:46:32.708269 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Nov 24 06:46:32.708312 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Nov 24 06:46:32.708355 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Nov 24 06:46:32.708417 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Nov 24 06:46:32.708474 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Nov 24 06:46:32.708526 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Nov 24 06:46:32.708584 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Nov 24 06:46:32.708648 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Nov 24 06:46:32.708700 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Nov 24 06:46:32.710701 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Nov 24 06:46:32.710769 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Nov 24 06:46:32.710833 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Nov 24 06:46:32.710900 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Nov 24 06:46:32.710974 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Nov 24 06:46:32.711033 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Nov 24 06:46:32.711084 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Nov 24 06:46:32.711144 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 
0x088000 conventional PCI endpoint Nov 24 06:46:32.711205 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Nov 24 06:46:32.711260 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Nov 24 06:46:32.711313 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Nov 24 06:46:32.711363 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Nov 24 06:46:32.711412 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Nov 24 06:46:32.711461 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Nov 24 06:46:32.711513 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Nov 24 06:46:32.711561 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Nov 24 06:46:32.711615 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Nov 24 06:46:32.712699 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Nov 24 06:46:32.712769 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Nov 24 06:46:32.712822 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Nov 24 06:46:32.712873 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Nov 24 06:46:32.712935 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.712988 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Nov 24 06:46:32.713039 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Nov 24 06:46:32.713089 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Nov 24 06:46:32.713139 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.713193 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.713246 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Nov 24 06:46:32.713299 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Nov 24 06:46:32.713350 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Nov 24 06:46:32.713405 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Nov 24 06:46:32.713457 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.713512 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.713571 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Nov 24 06:46:32.713634 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Nov 24 06:46:32.716558 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Nov 24 06:46:32.716616 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:46:32.716679 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.716737 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.716791 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:46:32.716849 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Nov 24 06:46:32.716912 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:46:32.716963 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.717019 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.717070 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:46:32.717120 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Nov 24 06:46:32.717170 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:46:32.717220 
kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.717294 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.717375 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:46:32.717426 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Nov 24 06:46:32.717477 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:46:32.717528 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.717582 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.717633 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:46:32.717699 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Nov 24 06:46:32.717753 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 06:46:32.717811 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.717868 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.717919 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:46:32.717969 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Nov 24 06:46:32.718018 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:46:32.718068 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.718125 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.718175 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:46:32.718225 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Nov 24 06:46:32.718274 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Nov 24 06:46:32.718323 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.718379 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.718430 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:46:32.718482 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Nov 24 06:46:32.718531 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Nov 24 06:46:32.718581 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Nov 24 06:46:32.718629 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.720398 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.720455 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:46:32.720506 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Nov 24 06:46:32.720560 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Nov 24 06:46:32.720610 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:46:32.721298 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.721362 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.721416 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:46:32.721467 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Nov 24 06:46:32.721519 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:46:32.721572 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.721626 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.721737 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:46:32.721789 kernel: pci 
0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Nov 24 06:46:32.721853 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:46:32.721905 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.721971 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.722027 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:46:32.722077 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Nov 24 06:46:32.722130 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:46:32.722180 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.722235 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.722286 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:46:32.722335 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Nov 24 06:46:32.722384 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:46:32.722437 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.722492 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.722542 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 06:46:32.722592 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Nov 24 06:46:32.722661 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:46:32.722716 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.722770 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.722825 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 24 06:46:32.722874 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Nov 24 06:46:32.722924 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Nov 24 06:46:32.722980 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:46:32.723031 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.723089 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.723140 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:46:32.723193 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Nov 24 06:46:32.723244 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Nov 24 06:46:32.723293 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:46:32.723342 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.723401 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.723452 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:46:32.723504 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Nov 24 06:46:32.723591 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Nov 24 06:46:32.723668 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 06:46:32.723722 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.723782 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.723891 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:46:32.723962 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Nov 24 06:46:32.724017 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 
06:46:32.724067 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.724122 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.724173 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:46:32.726470 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Nov 24 06:46:32.726583 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:46:32.726665 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.726743 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.726806 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:46:32.726873 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Nov 24 06:46:32.726931 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:46:32.726993 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.727171 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.727253 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:46:32.727312 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Nov 24 06:46:32.727369 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Nov 24 06:46:32.727421 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.727476 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.727528 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:46:32.727583 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Nov 24 06:46:32.727634 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:46:32.727694 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.727749 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.727800 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:46:32.727851 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Nov 24 06:46:32.727900 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Nov 24 06:46:32.727954 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:46:32.728005 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.728062 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.728113 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:46:32.728165 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Nov 24 06:46:32.728239 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Nov 24 06:46:32.728315 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:46:32.728401 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.728487 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.728549 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:46:32.728601 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Nov 24 06:46:32.732713 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:46:32.732800 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.732864 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.732927 kernel: pci 0000:00:18.3: PCI bridge to [bus 
1e] Nov 24 06:46:32.732980 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Nov 24 06:46:32.733031 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 24 06:46:32.733085 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.733152 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.733206 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:46:32.733256 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Nov 24 06:46:32.733310 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Nov 24 06:46:32.733360 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.733415 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.733465 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:46:32.733515 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Nov 24 06:46:32.733566 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:46:32.733615 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.734813 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.734886 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:46:32.734940 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Nov 24 06:46:32.735719 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:46:32.735787 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.735851 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:46:32.735909 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:46:32.735964 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Nov 24 06:46:32.736024 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:46:32.736078 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.736140 kernel: pci_bus 0000:01: extended config space not accessible Nov 24 06:46:32.736199 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Nov 24 06:46:32.736256 kernel: pci_bus 0000:02: extended config space not accessible Nov 24 06:46:32.736266 kernel: acpiphp: Slot [32] registered Nov 24 06:46:32.736272 kernel: acpiphp: Slot [33] registered Nov 24 06:46:32.736280 kernel: acpiphp: Slot [34] registered Nov 24 06:46:32.736286 kernel: acpiphp: Slot [35] registered Nov 24 06:46:32.736292 kernel: acpiphp: Slot [36] registered Nov 24 06:46:32.736298 kernel: acpiphp: Slot [37] registered Nov 24 06:46:32.736304 kernel: acpiphp: Slot [38] registered Nov 24 06:46:32.736309 kernel: acpiphp: Slot [39] registered Nov 24 06:46:32.736315 kernel: acpiphp: Slot [40] registered Nov 24 06:46:32.736321 kernel: acpiphp: Slot [41] registered Nov 24 06:46:32.736327 kernel: acpiphp: Slot [42] registered Nov 24 06:46:32.736332 kernel: acpiphp: Slot [43] registered Nov 24 06:46:32.736340 kernel: acpiphp: Slot [44] registered Nov 24 06:46:32.736345 kernel: acpiphp: Slot [45] registered Nov 24 06:46:32.736351 kernel: acpiphp: Slot [46] registered Nov 24 06:46:32.736357 kernel: acpiphp: Slot [47] registered Nov 24 06:46:32.736363 kernel: acpiphp: Slot [48] registered Nov 24 06:46:32.736369 kernel: acpiphp: Slot [49] registered Nov 24 06:46:32.736374 kernel: acpiphp: Slot [50] registered Nov 24 06:46:32.736380 kernel: acpiphp: Slot [51] registered Nov 24 
06:46:32.736386 kernel: acpiphp: Slot [52] registered Nov 24 06:46:32.736393 kernel: acpiphp: Slot [53] registered Nov 24 06:46:32.736399 kernel: acpiphp: Slot [54] registered Nov 24 06:46:32.736405 kernel: acpiphp: Slot [55] registered Nov 24 06:46:32.736411 kernel: acpiphp: Slot [56] registered Nov 24 06:46:32.736416 kernel: acpiphp: Slot [57] registered Nov 24 06:46:32.736422 kernel: acpiphp: Slot [58] registered Nov 24 06:46:32.736428 kernel: acpiphp: Slot [59] registered Nov 24 06:46:32.736433 kernel: acpiphp: Slot [60] registered Nov 24 06:46:32.736439 kernel: acpiphp: Slot [61] registered Nov 24 06:46:32.736445 kernel: acpiphp: Slot [62] registered Nov 24 06:46:32.736452 kernel: acpiphp: Slot [63] registered Nov 24 06:46:32.736506 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Nov 24 06:46:32.736561 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Nov 24 06:46:32.736614 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Nov 24 06:46:32.737217 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Nov 24 06:46:32.737276 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Nov 24 06:46:32.737329 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Nov 24 06:46:32.737390 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Nov 24 06:46:32.737449 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Nov 24 06:46:32.737503 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Nov 24 06:46:32.737556 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Nov 24 06:46:32.737608 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Nov 24 06:46:32.737683 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Nov 24 06:46:32.737738 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Nov 24 06:46:32.737796 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Nov 24 06:46:32.737863 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Nov 24 06:46:32.737918 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:46:32.737972 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:46:32.738025 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:46:32.738078 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:46:32.738133 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:46:32.738192 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Nov 24 06:46:32.738248 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Nov 24 06:46:32.738300 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Nov 24 06:46:32.738353 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Nov 24 06:46:32.738405 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Nov 24 06:46:32.738457 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Nov 24 06:46:32.738508 kernel: pci 0000:0b:00.0: supports D1 D2 Nov 24 06:46:32.738560 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Nov 24 06:46:32.738615 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Nov 24 06:46:32.738678 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:46:32.738732 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:46:32.738787 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:46:32.738841 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:46:32.738895 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:46:32.738950 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:46:32.739004 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:46:32.739061 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 06:46:32.739116 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 24 06:46:32.739169 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:46:32.739223 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:46:32.739276 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:46:32.739329 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:46:32.739384 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:46:32.739441 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:46:32.739495 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:46:32.739550 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:46:32.739604 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:46:32.739706 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:46:32.739761 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Nov 24 06:46:32.739816 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:46:32.739873 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:46:32.739927 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:46:32.739981 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:46:32.739991 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Nov 24 06:46:32.739997 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Nov 24 06:46:32.740004 kernel: ACPI: PCI: Interrupt link LNKB disabled Nov 24 06:46:32.740010 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Nov 24 06:46:32.740016 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Nov 24 06:46:32.740024 kernel: iommu: Default domain type: Translated Nov 24 06:46:32.740030 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Nov 24 06:46:32.740036 kernel: PCI: Using ACPI for IRQ routing Nov 24 06:46:32.740042 kernel: PCI: pci_cache_line_size set to 64 bytes Nov 24 06:46:32.740048 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Nov 24 06:46:32.740054 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Nov 24 06:46:32.740105 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Nov 24 06:46:32.740156 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Nov 24 06:46:32.740209 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Nov 24 06:46:32.740221 kernel: vgaarb: loaded Nov 24 06:46:32.740228 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Nov 24 06:46:32.740233 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Nov 24 06:46:32.740239 kernel: clocksource: Switched to clocksource tsc-early Nov 24 06:46:32.740245 kernel: VFS: Disk quotas dquot_6.6.0 Nov 24 06:46:32.740251 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Nov 24 06:46:32.740258 kernel: pnp: PnP ACPI init Nov 24 06:46:32.740319 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Nov 24 06:46:32.740371 kernel: system 
00:00: [io 0x1040-0x104f] has been reserved Nov 24 06:46:32.740417 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Nov 24 06:46:32.740472 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Nov 24 06:46:32.740523 kernel: pnp 00:06: [dma 2] Nov 24 06:46:32.740573 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Nov 24 06:46:32.740619 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Nov 24 06:46:32.741070 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Nov 24 06:46:32.741085 kernel: pnp: PnP ACPI: found 8 devices Nov 24 06:46:32.741091 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Nov 24 06:46:32.741097 kernel: NET: Registered PF_INET protocol family Nov 24 06:46:32.741103 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Nov 24 06:46:32.741110 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Nov 24 06:46:32.741116 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Nov 24 06:46:32.741121 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Nov 24 06:46:32.741128 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Nov 24 06:46:32.741133 kernel: TCP: Hash tables configured (established 16384 bind 16384) Nov 24 06:46:32.741141 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Nov 24 06:46:32.741147 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Nov 24 06:46:32.741153 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Nov 24 06:46:32.741159 kernel: NET: Registered PF_XDP protocol family Nov 24 06:46:32.741214 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Nov 24 06:46:32.741270 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Nov 24 06:46:32.741325 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Nov 24 06:46:32.741380 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Nov 24 06:46:32.741437 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Nov 24 06:46:32.741491 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Nov 24 06:46:32.741544 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Nov 24 06:46:32.741599 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Nov 24 06:46:32.741692 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Nov 24 06:46:32.741748 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Nov 24 06:46:32.741801 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Nov 24 06:46:32.741863 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Nov 24 06:46:32.741921 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Nov 24 06:46:32.741975 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Nov 24 06:46:32.742028 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Nov 24 06:46:32.742081 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Nov 24 
06:46:32.742134 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Nov 24 06:46:32.742187 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Nov 24 06:46:32.742239 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Nov 24 06:46:32.742294 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Nov 24 06:46:32.742347 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Nov 24 06:46:32.742402 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Nov 24 06:46:32.742455 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Nov 24 06:46:32.742508 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Nov 24 06:46:32.742560 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Nov 24 06:46:32.742613 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.742681 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.742738 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.742788 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.742841 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.742902 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.742981 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743041 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743096 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743156 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743215 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743273 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743329 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743380 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743454 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743536 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743613 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743690 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743759 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743818 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.743885 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.743938 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744012 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744073 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744132 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744188 kernel: pci 0000:00:17.5: 
bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744262 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744321 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744389 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744444 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744497 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744548 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744625 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744700 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744762 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744829 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.744892 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.744961 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745034 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745087 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745158 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745229 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745304 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745371 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745444 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745496 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745562 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745649 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745733 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745796 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745856 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.745928 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.745989 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746041 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746094 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746153 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746227 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746311 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746396 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746463 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746534 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space Nov 24 06:46:32.746597 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746675 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746728 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746792 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746856 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.746919 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.746983 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747045 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747108 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747170 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747233 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747295 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747346 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747403 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747463 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747521 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747575 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747648 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747708 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747778 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747843 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.747929 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:46:32.747992 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:46:32.748047 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Nov 24 06:46:32.748103 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Nov 24 06:46:32.748166 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Nov 24 06:46:32.748231 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Nov 24 06:46:32.748305 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Nov 24 06:46:32.748363 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Nov 24 06:46:32.748416 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Nov 24 06:46:32.748475 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Nov 24 06:46:32.748527 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Nov 24 06:46:32.748597 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Nov 24 06:46:32.748682 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Nov 24 06:46:32.749659 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Nov 24 06:46:32.749759 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Nov 24 06:46:32.749825 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] Nov 24 06:46:32.749895 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Nov 24 06:46:32.749955 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Nov 24 06:46:32.750016 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Nov 24 06:46:32.750068 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:46:32.750121 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:46:32.750192 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Nov 24 06:46:32.750246 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:46:32.750318 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:46:32.750377 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Nov 24 06:46:32.750429 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:46:32.750489 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:46:32.750559 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Nov 24 06:46:32.750630 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:46:32.750727 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:46:32.750783 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Nov 24 06:46:32.750850 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 06:46:32.750919 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:46:32.750974 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Nov 24 06:46:32.751042 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:46:32.751116 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Nov 24 06:46:32.751177 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:46:32.751239 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Nov 24 06:46:32.751308 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Nov 24 06:46:32.751365 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Nov 24 06:46:32.751422 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:46:32.751477 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Nov 24 06:46:32.751532 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Nov 24 06:46:32.751583 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Nov 24 06:46:32.751652 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:46:32.751727 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Nov 24 06:46:32.751783 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Nov 24 06:46:32.751845 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:46:32.751902 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:46:32.751953 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Nov 24 06:46:32.752006 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:46:32.752076 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:46:32.752144 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Nov 24 06:46:32.752208 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:46:32.752275 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:46:32.752337 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Nov 24 06:46:32.752389 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:46:32.752444 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:46:32.752497 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Nov 24 06:46:32.752556 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:46:32.752631 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 06:46:32.752697 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Nov 24 06:46:32.752757 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:46:32.752810 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 24 06:46:32.752868 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Nov 24 06:46:32.752927 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Nov 24 06:46:32.753006 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:46:32.753070 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:46:32.753148 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Nov 24 06:46:32.753201 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Nov 24 06:46:32.753251 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:46:32.753303 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:46:32.753354 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Nov 24 06:46:32.753411 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Nov 24 06:46:32.753474 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 06:46:32.753543 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:46:32.753607 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Nov 24 06:46:32.753688 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 06:46:32.753754 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:46:32.753817 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Nov 24 06:46:32.753886 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:46:32.753943 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:46:32.753997 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Nov 24 06:46:32.754051 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:46:32.754105 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:46:32.754162 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Nov 24 06:46:32.754224 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Nov 24 06:46:32.754293 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:46:32.754367 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Nov 24 06:46:32.754425 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:46:32.754482 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:46:32.754552 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Nov 24 06:46:32.754623 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Nov 24 06:46:32.754703 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:46:32.754777 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:46:32.754842 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Nov 24 06:46:32.754894 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
Nov 24 06:46:32.754945 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:46:32.755005 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:46:32.755086 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Nov 24 06:46:32.755166 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:46:32.755238 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Nov 24 06:46:32.755309 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Nov 24 06:46:32.755381 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 24 06:46:32.755457 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:46:32.755511 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Nov 24 06:46:32.755577 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Nov 24 06:46:32.755670 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:46:32.755743 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Nov 24 06:46:32.755800 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:46:32.755859 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:46:32.755916 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Nov 24 06:46:32.755999 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:46:32.756083 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:46:32.756160 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Nov 24 06:46:32.756236 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:46:32.756309 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Nov 24 06:46:32.756358 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Nov 24 06:46:32.756421 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Nov 24 06:46:32.756470 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Nov 24 06:46:32.756518 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Nov 24 06:46:32.756590 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Nov 24 06:46:32.756660 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Nov 24 06:46:32.756721 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Nov 24 06:46:32.756787 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Nov 24 06:46:32.756835 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Nov 24 06:46:32.756910 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Nov 24 06:46:32.756987 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Nov 24 06:46:32.757057 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Nov 24 06:46:32.757114 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Nov 24 06:46:32.757165 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Nov 24 06:46:32.757212 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Nov 24 06:46:32.757262 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Nov 24 06:46:32.757330 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Nov 24 06:46:32.757407 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Nov 24 06:46:32.757490 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Nov 24 06:46:32.757539 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Nov 
24 06:46:32.757589 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:46:32.757689 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Nov 24 06:46:32.757767 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:46:32.757841 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Nov 24 06:46:32.757906 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:46:32.757962 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Nov 24 06:46:32.758013 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:46:32.758064 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Nov 24 06:46:32.758120 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 06:46:32.758199 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Nov 24 06:46:32.758277 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:46:32.758347 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Nov 24 06:46:32.758411 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Nov 24 06:46:32.758458 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Nov 24 06:46:32.758518 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Nov 24 06:46:32.758575 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Nov 24 06:46:32.758636 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Nov 24 06:46:32.759208 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Nov 24 06:46:32.761147 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Nov 24 06:46:32.761208 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:46:32.761271 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Nov 24 06:46:32.761336 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:46:32.761411 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Nov 24 06:46:32.761461 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:46:32.761534 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Nov 24 06:46:32.761610 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:46:32.761685 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Nov 24 06:46:32.761736 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:46:32.761787 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Nov 24 06:46:32.761866 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:46:32.761934 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Nov 24 06:46:32.761994 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Nov 24 06:46:32.762062 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:46:32.762115 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Nov 24 06:46:32.762172 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Nov 24 06:46:32.762220 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:46:32.762288 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Nov 24 06:46:32.762359 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Nov 24 06:46:32.762417 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 06:46:32.762485 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Nov 24 06:46:32.762549 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 06:46:32.762614 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Nov 24 06:46:32.763404 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:46:32.763472 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Nov 24 06:46:32.763526 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:46:32.763581 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Nov 24 06:46:32.763629 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Nov 24 06:46:32.763746 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Nov 24 06:46:32.763794 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:46:32.763846 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Nov 24 06:46:32.763897 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Nov 24 06:46:32.763944 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:46:32.763996 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Nov 24 06:46:32.764043 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Nov 24 06:46:32.764089 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:46:32.764144 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Nov 24 06:46:32.764192 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:46:32.764246 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Nov 24 06:46:32.764293 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 24 06:46:32.764345 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Nov 24 06:46:32.764392 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Nov 24 06:46:32.764443 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Nov 24 06:46:32.764489 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:46:32.764549 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Nov 24 06:46:32.764597 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:46:32.764661 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Nov 24 06:46:32.765080 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:46:32.765206 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Nov 24 06:46:32.765217 kernel: PCI: CLS 32 bytes, default 64 Nov 24 06:46:32.765224 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Nov 24 06:46:32.765233 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Nov 24 06:46:32.765239 kernel: clocksource: Switched to clocksource tsc Nov 24 06:46:32.765245 kernel: Initialise system trusted keyrings Nov 24 06:46:32.765251 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Nov 24 06:46:32.765257 kernel: Key type asymmetric registered Nov 24 06:46:32.765263 kernel: Asymmetric key parser 'x509' registered Nov 24 06:46:32.765269 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Nov 24 06:46:32.765275 kernel: io scheduler mq-deadline registered Nov 24 06:46:32.765281 kernel: io scheduler kyber registered Nov 24 06:46:32.765288 kernel: io scheduler bfq 
registered Nov 24 06:46:32.765345 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Nov 24 06:46:32.765399 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.765453 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Nov 24 06:46:32.765505 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.765557 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Nov 24 06:46:32.765609 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.765686 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Nov 24 06:46:32.766095 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.767716 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Nov 24 06:46:32.767783 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.767848 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Nov 24 06:46:32.767918 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.767986 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Nov 24 06:46:32.768041 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768099 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Nov 24 06:46:32.768151 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768203 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Nov 24 06:46:32.768255 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768308 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Nov 24 06:46:32.768360 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768416 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Nov 24 06:46:32.768467 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768522 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Nov 24 06:46:32.768575 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768683 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Nov 24 06:46:32.768770 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.768834 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Nov 24 06:46:32.768895 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
Nov 24 06:46:32.768954 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Nov 24 06:46:32.769010 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769063 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Nov 24 06:46:32.769114 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769167 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Nov 24 06:46:32.769219 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769272 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Nov 24 06:46:32.769331 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769408 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Nov 24 06:46:32.769470 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769524 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Nov 24 06:46:32.769576 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769676 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Nov 24 06:46:32.769759 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769814 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Nov 24 06:46:32.769871 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.769937 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Nov 24 06:46:32.769990 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.770047 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Nov 24 06:46:32.770119 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.770185 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Nov 24 06:46:32.770238 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.770293 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Nov 24 06:46:32.770358 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.770412 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Nov 24 06:46:32.770463 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.770516 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Nov 24 06:46:32.770567 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 
06:46:32.770619 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Nov 24 06:46:32.772092 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.772194 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Nov 24 06:46:32.772281 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.772347 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Nov 24 06:46:32.772411 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.772466 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Nov 24 06:46:32.772544 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:46:32.772558 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Nov 24 06:46:32.772564 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Nov 24 06:46:32.772571 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Nov 24 06:46:32.772577 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Nov 24 06:46:32.772584 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Nov 24 06:46:32.772590 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Nov 24 06:46:32.772688 kernel: rtc_cmos 00:01: registered as rtc0 Nov 24 06:46:32.772742 kernel: rtc_cmos 00:01: setting system clock to 2025-11-24T06:46:32 UTC (1763966792) Nov 24 06:46:32.772754 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Nov 24 06:46:32.772821 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Nov 24 06:46:32.772833 kernel: intel_pstate: CPU model not supported Nov 24 06:46:32.772844 kernel: NET: Registered PF_INET6 protocol family Nov 24 06:46:32.772850 kernel: Segment Routing with IPv6 Nov 24 06:46:32.772857 kernel: In-situ OAM (IOAM) with IPv6 Nov 24 06:46:32.772865 kernel: NET: Registered PF_PACKET protocol family Nov 24 06:46:32.772876 kernel: Key type dns_resolver registered Nov 24 06:46:32.772883 kernel: IPI shorthand broadcast: enabled Nov 24 06:46:32.772889 kernel: sched_clock: Marking stable (2693258878, 172837098)->(2882927223, -16831247) Nov 24 06:46:32.772898 kernel: registered taskstats version 1 Nov 24 06:46:32.772904 kernel: Loading compiled-in X.509 certificates Nov 24 06:46:32.772911 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.58-flatcar: 960cbe7f2b1ea74b5c881d6d42eea4d1ac19a607' Nov 24 06:46:32.772917 kernel: Demotion targets for Node 0: null Nov 24 06:46:32.772923 kernel: Key type .fscrypt registered Nov 24 06:46:32.772929 kernel: Key type fscrypt-provisioning registered Nov 24 06:46:32.772935 kernel: ima: No TPM chip found, activating TPM-bypass! Nov 24 06:46:32.772941 kernel: ima: Allocated hash algorithm: sha1 Nov 24 06:46:32.772949 kernel: ima: No architecture policies found Nov 24 06:46:32.772956 kernel: clk: Disabling unused clocks Nov 24 06:46:32.772962 kernel: Warning: unable to open an initial console. 
Nov 24 06:46:32.772968 kernel: Freeing unused kernel image (initmem) memory: 46200K Nov 24 06:46:32.772975 kernel: Write protecting the kernel read-only data: 40960k Nov 24 06:46:32.772981 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Nov 24 06:46:32.772987 kernel: Run /init as init process Nov 24 06:46:32.772994 kernel: with arguments: Nov 24 06:46:32.773000 kernel: /init Nov 24 06:46:32.773007 kernel: with environment: Nov 24 06:46:32.773013 kernel: HOME=/ Nov 24 06:46:32.773019 kernel: TERM=linux Nov 24 06:46:32.773026 systemd[1]: Successfully made /usr/ read-only. Nov 24 06:46:32.773036 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Nov 24 06:46:32.773043 systemd[1]: Detected virtualization vmware. Nov 24 06:46:32.773049 systemd[1]: Detected architecture x86-64. Nov 24 06:46:32.773057 systemd[1]: Running in initrd. Nov 24 06:46:32.773069 systemd[1]: No hostname configured, using default hostname. Nov 24 06:46:32.773081 systemd[1]: Hostname set to <localhost>. Nov 24 06:46:32.773092 systemd[1]: Initializing machine ID from random generator. Nov 24 06:46:32.773099 systemd[1]: Queued start job for default target initrd.target. Nov 24 06:46:32.773106 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Nov 24 06:46:32.773112 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Nov 24 06:46:32.773119 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Nov 24 06:46:32.773126 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Nov 24 06:46:32.773138 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Nov 24 06:46:32.773151 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Nov 24 06:46:32.773166 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Nov 24 06:46:32.773174 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Nov 24 06:46:32.773181 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Nov 24 06:46:32.773187 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Nov 24 06:46:32.773194 systemd[1]: Reached target paths.target - Path Units. Nov 24 06:46:32.773207 systemd[1]: Reached target slices.target - Slice Units. Nov 24 06:46:32.773216 systemd[1]: Reached target swap.target - Swaps. Nov 24 06:46:32.773222 systemd[1]: Reached target timers.target - Timer Units. Nov 24 06:46:32.773229 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Nov 24 06:46:32.773235 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Nov 24 06:46:32.773242 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Nov 24 06:46:32.773249 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Nov 24 06:46:32.773255 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Nov 24 06:46:32.773263 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Nov 24 06:46:32.773270 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Nov 24 06:46:32.773276 systemd[1]: Reached target sockets.target - Socket Units. Nov 24 06:46:32.773286 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Nov 24 06:46:32.773296 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Nov 24 06:46:32.773307 systemd[1]: Finished network-cleanup.service - Network Cleanup. Nov 24 06:46:32.773317 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Nov 24 06:46:32.773329 systemd[1]: Starting systemd-fsck-usr.service... Nov 24 06:46:32.773340 systemd[1]: Starting systemd-journald.service - Journal Service... Nov 24 06:46:32.773355 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Nov 24 06:46:32.773367 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:46:32.773377 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Nov 24 06:46:32.773410 systemd-journald[225]: Collecting audit messages is disabled. Nov 24 06:46:32.773437 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Nov 24 06:46:32.773449 systemd[1]: Finished systemd-fsck-usr.service. Nov 24 06:46:32.773461 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Nov 24 06:46:32.773471 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Nov 24 06:46:32.773483 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Nov 24 06:46:32.773492 kernel: Bridge firewalling registered Nov 24 06:46:32.773501 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Nov 24 06:46:32.773511 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:46:32.773523 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Nov 24 06:46:32.773533 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Nov 24 06:46:32.773545 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Nov 24 06:46:32.773555 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Nov 24 06:46:32.773568 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Nov 24 06:46:32.773579 systemd-journald[225]: Journal started Nov 24 06:46:32.773601 systemd-journald[225]: Runtime Journal (/run/log/journal/d5c5e062021748d0b541b6e5db1e5b2f) is 4.8M, max 38.5M, 33.7M free. Nov 24 06:46:32.773652 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Nov 24 06:46:32.710028 systemd-modules-load[226]: Inserted module 'overlay' Nov 24 06:46:32.736840 systemd-modules-load[226]: Inserted module 'br_netfilter' Nov 24 06:46:32.776114 systemd[1]: Started systemd-journald.service - Journal Service. Nov 24 06:46:32.778442 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Nov 24 06:46:32.779816 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Nov 24 06:46:32.798498 systemd-tmpfiles[261]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Nov 24 06:46:32.800857 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Nov 24 06:46:32.802068 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Nov 24 06:46:32.803874 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14 Nov 24 06:46:32.836478 systemd-resolved[273]: Positive Trust Anchors: Nov 24 06:46:32.836488 systemd-resolved[273]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Nov 24 06:46:32.836512 systemd-resolved[273]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Nov 24 06:46:32.838797 systemd-resolved[273]: Defaulting to hostname 'linux'. Nov 24 06:46:32.839469 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Nov 24 06:46:32.839626 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Nov 24 06:46:32.869663 kernel: SCSI subsystem initialized Nov 24 06:46:32.889657 kernel: Loading iSCSI transport class v2.0-870. Nov 24 06:46:32.897650 kernel: iscsi: registered transport (tcp) Nov 24 06:46:32.919967 kernel: iscsi: registered transport (qla4xxx) Nov 24 06:46:32.920014 kernel: QLogic iSCSI HBA Driver Nov 24 06:46:32.931031 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Nov 24 06:46:32.945276 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Nov 24 06:46:32.946159 systemd[1]: Reached target network-pre.target - Preparation for Network. Nov 24 06:46:32.971624 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Nov 24 06:46:32.972534 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Nov 24 06:46:33.018687 kernel: raid6: avx2x4 gen() 46400 MB/s Nov 24 06:46:33.034688 kernel: raid6: avx2x2 gen() 51848 MB/s Nov 24 06:46:33.051969 kernel: raid6: avx2x1 gen() 44790 MB/s Nov 24 06:46:33.052001 kernel: raid6: using algorithm avx2x2 gen() 51848 MB/s Nov 24 06:46:33.069884 kernel: raid6: .... xor() 31710 MB/s, rmw enabled Nov 24 06:46:33.069932 kernel: raid6: using avx2x2 recovery algorithm Nov 24 06:46:33.086661 kernel: xor: automatically using best checksumming function avx Nov 24 06:46:33.203678 kernel: Btrfs loaded, zoned=no, fsverity=no Nov 24 06:46:33.208129 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Nov 24 06:46:33.209141 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Nov 24 06:46:33.232573 systemd-udevd[474]: Using default interface naming scheme 'v255'. Nov 24 06:46:33.236189 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Nov 24 06:46:33.237288 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Nov 24 06:46:33.254626 dracut-pre-trigger[479]: rd.md=0: removing MD RAID activation Nov 24 06:46:33.268583 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Nov 24 06:46:33.269499 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Nov 24 06:46:33.361945 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Nov 24 06:46:33.363708 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Nov 24 06:46:33.449652 kernel: VMware PVSCSI driver - version 1.0.7.0-k Nov 24 06:46:33.456658 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Nov 24 06:46:33.456686 kernel: vmw_pvscsi: using 64bit dma Nov 24 06:46:33.459612 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Nov 24 06:46:33.459740 kernel: vmw_pvscsi: max_id: 16 Nov 24 06:46:33.459750 kernel: vmw_pvscsi: setting ring_pages to 8 Nov 24 06:46:33.466653 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Nov 24 06:46:33.469830 kernel: vmw_pvscsi: enabling reqCallThreshold Nov 24 06:46:33.469867 kernel: vmw_pvscsi: driver-based request coalescing enabled Nov 24 06:46:33.469881 kernel: vmw_pvscsi: using MSI-X Nov 24 06:46:33.469889 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Nov 24 06:46:33.482662 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Nov 24 06:46:33.484655 kernel: cryptd: max_cpu_qlen set to 1000 Nov 24 06:46:33.491659 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Nov 24 06:46:33.495132 (udev-worker)[525]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Nov 24 06:46:33.495785 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Nov 24 06:46:33.501688 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Nov 24 06:46:33.502823 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Nov 24 06:46:33.502944 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:46:33.504811 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:46:33.505589 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:46:33.521658 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Nov 24 06:46:33.521791 kernel: libata version 3.00 loaded. Nov 24 06:46:33.521801 kernel: sd 0:0:0:0: [sda] Write Protect is off Nov 24 06:46:33.522661 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Nov 24 06:46:33.522740 kernel: sd 0:0:0:0: [sda] Cache data unavailable Nov 24 06:46:33.524116 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Nov 24 06:46:33.531662 kernel: AES CTR mode by8 optimization enabled Nov 24 06:46:33.541041 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Nov 24 06:46:33.564661 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:46:33.565658 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Nov 24 06:46:33.569655 kernel: ata_piix 0000:00:07.1: version 2.13 Nov 24 06:46:33.570651 kernel: scsi host1: ata_piix Nov 24 06:46:33.571654 kernel: scsi host2: ata_piix Nov 24 06:46:33.574518 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Nov 24 06:46:33.574538 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Nov 24 06:46:33.645584 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Nov 24 06:46:33.652247 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Nov 24 06:46:33.658916 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Nov 24 06:46:33.663963 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Nov 24 06:46:33.664326 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Nov 24 06:46:33.665234 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Nov 24 06:46:33.761667 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Nov 24 06:46:33.767722 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Nov 24 06:46:33.771658 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:46:33.781663 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:46:33.808653 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Nov 24 06:46:33.808801 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Nov 24 06:46:33.823660 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Nov 24 06:46:34.133727 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Nov 24 06:46:34.134068 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Nov 24 06:46:34.134196 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Nov 24 06:46:34.134389 systemd[1]: Reached target remote-fs.target - Remote File Systems. Nov 24 06:46:34.135032 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Nov 24 06:46:34.149561 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Nov 24 06:46:34.783709 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:46:34.784203 disk-uuid[625]: The operation has completed successfully. Nov 24 06:46:34.817973 systemd[1]: disk-uuid.service: Deactivated successfully. Nov 24 06:46:34.818208 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Nov 24 06:46:34.833763 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Nov 24 06:46:34.846444 sh[657]: Success Nov 24 06:46:34.860008 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Nov 24 06:46:34.860046 kernel: device-mapper: uevent: version 1.0.3 Nov 24 06:46:34.861304 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Nov 24 06:46:34.868664 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Nov 24 06:46:34.944561 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Nov 24 06:46:34.945680 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Nov 24 06:46:34.953758 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Nov 24 06:46:34.983653 kernel: BTRFS: device fsid 3af95a3e-5df6-49e0-91e3-ddf2109f68c7 devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (669) Nov 24 06:46:34.986283 kernel: BTRFS info (device dm-0): first mount of filesystem 3af95a3e-5df6-49e0-91e3-ddf2109f68c7 Nov 24 06:46:34.986301 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:46:34.994016 kernel: BTRFS info (device dm-0): enabling ssd optimizations Nov 24 06:46:34.994034 kernel: BTRFS info (device dm-0): disabling log replay at mount time Nov 24 06:46:34.994046 kernel: BTRFS info (device dm-0): enabling free space tree Nov 24 06:46:34.996798 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Nov 24 06:46:34.997248 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Nov 24 06:46:34.997927 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Nov 24 06:46:34.999693 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Nov 24 06:46:35.050663 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (692) Nov 24 06:46:35.053097 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:46:35.053119 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:46:35.059234 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:46:35.059257 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:46:35.062651 kernel: BTRFS info (device sda6): last unmount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:46:35.065838 systemd[1]: Finished ignition-setup.service - Ignition (setup). Nov 24 06:46:35.066744 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Nov 24 06:46:35.146040 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Nov 24 06:46:35.147068 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Nov 24 06:46:35.236272 ignition[711]: Ignition 2.22.0 Nov 24 06:46:35.236680 ignition[711]: Stage: fetch-offline Nov 24 06:46:35.237001 ignition[711]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:35.237119 ignition[711]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:35.237292 ignition[711]: parsed url from cmdline: "" Nov 24 06:46:35.237319 ignition[711]: no config URL provided Nov 24 06:46:35.237626 ignition[711]: reading system config file "/usr/lib/ignition/user.ign" Nov 24 06:46:35.237799 ignition[711]: no config at "/usr/lib/ignition/user.ign" Nov 24 06:46:35.238273 ignition[711]: config successfully fetched Nov 24 06:46:35.238320 ignition[711]: parsing config with SHA512: aa0d2e2eeb714500e7e9bbccb76523d7e9fb800d355adef0a3169b566ac821203bbe7b9243648b9a9c4f86994350675702d2398610575ab8f2a900d0cd362c0f Nov 24 06:46:35.239609 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Nov 24 06:46:35.240803 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Nov 24 06:46:35.241455 unknown[711]: fetched base config from "system" Nov 24 06:46:35.241467 unknown[711]: fetched user config from "vmware" Nov 24 06:46:35.243187 ignition[711]: fetch-offline: fetch-offline passed Nov 24 06:46:35.243255 ignition[711]: Ignition finished successfully Nov 24 06:46:35.245833 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Nov 24 06:46:35.270525 systemd-networkd[850]: lo: Link UP Nov 24 06:46:35.270532 systemd-networkd[850]: lo: Gained carrier Nov 24 06:46:35.271305 systemd-networkd[850]: Enumeration completed Nov 24 06:46:35.271564 systemd-networkd[850]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Nov 24 06:46:35.271929 systemd[1]: Started systemd-networkd.service - Network Configuration. Nov 24 06:46:35.272100 systemd[1]: Reached target network.target - Network. Nov 24 06:46:35.274684 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Nov 24 06:46:35.274810 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Nov 24 06:46:35.272186 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Nov 24 06:46:35.272847 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Nov 24 06:46:35.274879 systemd-networkd[850]: ens192: Link UP Nov 24 06:46:35.274882 systemd-networkd[850]: ens192: Gained carrier Nov 24 06:46:35.289679 ignition[854]: Ignition 2.22.0 Nov 24 06:46:35.289939 ignition[854]: Stage: kargs Nov 24 06:46:35.290088 ignition[854]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:35.290093 ignition[854]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:35.290548 ignition[854]: kargs: kargs passed Nov 24 06:46:35.290578 ignition[854]: Ignition finished successfully Nov 24 06:46:35.291714 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Nov 24 06:46:35.292565 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Nov 24 06:46:35.307583 ignition[861]: Ignition 2.22.0 Nov 24 06:46:35.307874 ignition[861]: Stage: disks Nov 24 06:46:35.308044 ignition[861]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:35.308162 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:35.308738 ignition[861]: disks: disks passed Nov 24 06:46:35.308863 ignition[861]: Ignition finished successfully Nov 24 06:46:35.309663 systemd[1]: Finished ignition-disks.service - Ignition (disks). Nov 24 06:46:35.310019 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Nov 24 06:46:35.310157 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Nov 24 06:46:35.310349 systemd[1]: Reached target local-fs.target - Local File Systems. Nov 24 06:46:35.310541 systemd[1]: Reached target sysinit.target - System Initialization. Nov 24 06:46:35.310716 systemd[1]: Reached target basic.target - Basic System. Nov 24 06:46:35.311381 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Nov 24 06:46:35.359984 systemd-fsck[869]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Nov 24 06:46:35.361523 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Nov 24 06:46:35.362793 systemd[1]: Mounting sysroot.mount - /sysroot... Nov 24 06:46:35.453649 kernel: EXT4-fs (sda9): mounted filesystem f89e2a65-2a4a-426b-9659-02844cc29a2a r/w with ordered data mode. Quota mode: none. 
Nov 24 06:46:35.454019 systemd[1]: Mounted sysroot.mount - /sysroot. Nov 24 06:46:35.454491 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Nov 24 06:46:35.455594 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Nov 24 06:46:35.457675 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Nov 24 06:46:35.458933 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Nov 24 06:46:35.459145 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Nov 24 06:46:35.459366 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Nov 24 06:46:35.466827 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Nov 24 06:46:35.468041 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Nov 24 06:46:35.472757 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (877) Nov 24 06:46:35.475201 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:46:35.475219 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:46:35.481130 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:46:35.481148 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:46:35.482373 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Nov 24 06:46:35.510274 initrd-setup-root[902]: cut: /sysroot/etc/passwd: No such file or directory Nov 24 06:46:35.513386 initrd-setup-root[909]: cut: /sysroot/etc/group: No such file or directory Nov 24 06:46:35.515619 initrd-setup-root[916]: cut: /sysroot/etc/shadow: No such file or directory Nov 24 06:46:35.518456 initrd-setup-root[923]: cut: /sysroot/etc/gshadow: No such file or directory Nov 24 06:46:35.581988 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Nov 24 06:46:35.582650 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Nov 24 06:46:35.583712 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Nov 24 06:46:35.598995 kernel: BTRFS info (device sda6): last unmount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:46:35.613009 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Nov 24 06:46:35.616057 ignition[991]: INFO : Ignition 2.22.0 Nov 24 06:46:35.616057 ignition[991]: INFO : Stage: mount Nov 24 06:46:35.616373 ignition[991]: INFO : no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:35.616373 ignition[991]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:35.616690 ignition[991]: INFO : mount: mount passed Nov 24 06:46:35.617360 ignition[991]: INFO : Ignition finished successfully Nov 24 06:46:35.617582 systemd[1]: Finished ignition-mount.service - Ignition (mount). Nov 24 06:46:35.618481 systemd[1]: Starting ignition-files.service - Ignition (files)... Nov 24 06:46:35.983269 systemd[1]: sysroot-oem.mount: Deactivated successfully. Nov 24 06:46:35.984767 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Nov 24 06:46:36.073659 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1003) Nov 24 06:46:36.084453 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:46:36.084512 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:46:36.150680 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:46:36.150730 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:46:36.153088 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Nov 24 06:46:36.180096 ignition[1020]: INFO : Ignition 2.22.0 Nov 24 06:46:36.180692 ignition[1020]: INFO : Stage: files Nov 24 06:46:36.180692 ignition[1020]: INFO : no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:36.180692 ignition[1020]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:36.181462 ignition[1020]: DEBUG : files: compiled without relabeling support, skipping Nov 24 06:46:36.182096 ignition[1020]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Nov 24 06:46:36.182298 ignition[1020]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Nov 24 06:46:36.183885 ignition[1020]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Nov 24 06:46:36.184088 ignition[1020]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Nov 24 06:46:36.184336 unknown[1020]: wrote ssh authorized keys file for user: core Nov 24 06:46:36.184580 ignition[1020]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Nov 24 06:46:36.185982 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Nov 24 06:46:36.185982 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Nov 24 06:46:36.270834 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Nov 24 06:46:36.357807 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Nov 24 06:46:36.358303 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Nov 24 06:46:36.358303 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Nov 24 06:46:36.358303 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Nov 24 06:46:36.359149 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Nov 24 06:46:36.359149 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Nov 24 06:46:36.359149 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Nov 24 06:46:36.359149 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Nov 24 06:46:36.359149 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Nov 24 06:46:36.360212 ignition[1020]: INFO : 
files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Nov 24 06:46:36.360212 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Nov 24 06:46:36.360212 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Nov 24 06:46:36.362590 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Nov 24 06:46:36.362590 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Nov 24 06:46:36.362590 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Nov 24 06:46:36.403734 systemd-networkd[850]: ens192: Gained IPv6LL Nov 24 06:46:36.784343 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Nov 24 06:46:37.041791 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Nov 24 06:46:37.041791 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Nov 24 06:46:37.042547 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Nov 24 06:46:37.042753 ignition[1020]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Nov 24 06:46:37.042895 ignition[1020]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Nov 24 06:46:37.043125 ignition[1020]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Nov 24 06:46:37.043125 ignition[1020]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Nov 24 06:46:37.043125 ignition[1020]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Nov 24 06:46:37.043595 ignition[1020]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Nov 24 06:46:37.043595 ignition[1020]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Nov 24 06:46:37.043595 ignition[1020]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Nov 24 06:46:37.043595 ignition[1020]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Nov 24 06:46:37.066739 ignition[1020]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Nov 24 06:46:37.068997 ignition[1020]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Nov 24 06:46:37.069210 ignition[1020]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Nov 24 06:46:37.069210 ignition[1020]: INFO : files: op(12): [started] setting preset to enabled 
for "prepare-helm.service" Nov 24 06:46:37.069210 ignition[1020]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Nov 24 06:46:37.069210 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Nov 24 06:46:37.070606 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Nov 24 06:46:37.070606 ignition[1020]: INFO : files: files passed Nov 24 06:46:37.070606 ignition[1020]: INFO : Ignition finished successfully Nov 24 06:46:37.071143 systemd[1]: Finished ignition-files.service - Ignition (files). Nov 24 06:46:37.072071 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Nov 24 06:46:37.072708 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Nov 24 06:46:37.089289 systemd[1]: ignition-quench.service: Deactivated successfully. Nov 24 06:46:37.089522 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Nov 24 06:46:37.091381 initrd-setup-root-after-ignition[1051]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Nov 24 06:46:37.091381 initrd-setup-root-after-ignition[1051]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Nov 24 06:46:37.092537 initrd-setup-root-after-ignition[1055]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Nov 24 06:46:37.093379 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Nov 24 06:46:37.093918 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Nov 24 06:46:37.094597 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Nov 24 06:46:37.121981 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Nov 24 06:46:37.122054 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Nov 24 06:46:37.122358 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Nov 24 06:46:37.122591 systemd[1]: Reached target initrd.target - Initrd Default Target. Nov 24 06:46:37.122831 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Nov 24 06:46:37.123326 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Nov 24 06:46:37.142251 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Nov 24 06:46:37.143305 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Nov 24 06:46:37.156290 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Nov 24 06:46:37.156460 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Nov 24 06:46:37.156699 systemd[1]: Stopped target timers.target - Timer Units. Nov 24 06:46:37.156894 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Nov 24 06:46:37.156965 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Nov 24 06:46:37.157350 systemd[1]: Stopped target initrd.target - Initrd Default Target. Nov 24 06:46:37.157498 systemd[1]: Stopped target basic.target - Basic System. Nov 24 06:46:37.157688 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Nov 24 06:46:37.157906 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Nov 24 06:46:37.158105 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Nov 24 06:46:37.158350 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Nov 24 06:46:37.158502 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Nov 24 06:46:37.158778 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Nov 24 06:46:37.159001 systemd[1]: Stopped target sysinit.target - System Initialization. Nov 24 06:46:37.159202 systemd[1]: Stopped target local-fs.target - Local File Systems. Nov 24 06:46:37.159390 systemd[1]: Stopped target swap.target - Swaps. Nov 24 06:46:37.159551 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Nov 24 06:46:37.159613 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Nov 24 06:46:37.159885 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Nov 24 06:46:37.160131 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Nov 24 06:46:37.160349 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Nov 24 06:46:37.160397 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Nov 24 06:46:37.160560 systemd[1]: dracut-initqueue.service: Deactivated successfully. Nov 24 06:46:37.160623 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Nov 24 06:46:37.160951 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Nov 24 06:46:37.161015 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Nov 24 06:46:37.161252 systemd[1]: Stopped target paths.target - Path Units. Nov 24 06:46:37.161396 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Nov 24 06:46:37.164662 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Nov 24 06:46:37.164840 systemd[1]: Stopped target slices.target - Slice Units. Nov 24 06:46:37.165070 systemd[1]: Stopped target sockets.target - Socket Units. Nov 24 06:46:37.165245 systemd[1]: iscsid.socket: Deactivated successfully. Nov 24 06:46:37.165305 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Nov 24 06:46:37.165451 systemd[1]: iscsiuio.socket: Deactivated successfully. Nov 24 06:46:37.165497 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Nov 24 06:46:37.165676 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Nov 24 06:46:37.165753 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Nov 24 06:46:37.166002 systemd[1]: ignition-files.service: Deactivated successfully. Nov 24 06:46:37.166061 systemd[1]: Stopped ignition-files.service - Ignition (files). Nov 24 06:46:37.166830 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Nov 24 06:46:37.166930 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Nov 24 06:46:37.166996 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Nov 24 06:46:37.167532 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Nov 24 06:46:37.168686 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Nov 24 06:46:37.168775 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Nov 24 06:46:37.168986 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Nov 24 06:46:37.169047 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Nov 24 06:46:37.172941 systemd[1]: initrd-cleanup.service: Deactivated successfully. Nov 24 06:46:37.174724 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Nov 24 06:46:37.181075 systemd[1]: sysroot-boot.mount: Deactivated successfully. Nov 24 06:46:37.185470 ignition[1075]: INFO : Ignition 2.22.0 Nov 24 06:46:37.185796 ignition[1075]: INFO : Stage: umount Nov 24 06:46:37.186617 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d" Nov 24 06:46:37.186617 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:46:37.186617 ignition[1075]: INFO : umount: umount passed Nov 24 06:46:37.186617 ignition[1075]: INFO : Ignition finished successfully Nov 24 06:46:37.188061 systemd[1]: ignition-mount.service: Deactivated successfully. Nov 24 06:46:37.188273 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Nov 24 06:46:37.188614 systemd[1]: Stopped target network.target - Network. Nov 24 06:46:37.188845 systemd[1]: ignition-disks.service: Deactivated successfully. Nov 24 06:46:37.188997 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Nov 24 06:46:37.189234 systemd[1]: ignition-kargs.service: Deactivated successfully. Nov 24 06:46:37.189378 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Nov 24 06:46:37.189617 systemd[1]: ignition-setup.service: Deactivated successfully. Nov 24 06:46:37.189772 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Nov 24 06:46:37.190028 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Nov 24 06:46:37.190157 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Nov 24 06:46:37.190597 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Nov 24 06:46:37.190902 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Nov 24 06:46:37.197676 systemd[1]: systemd-resolved.service: Deactivated successfully. Nov 24 06:46:37.197740 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Nov 24 06:46:37.199095 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Nov 24 06:46:37.199220 systemd[1]: systemd-networkd.service: Deactivated successfully. Nov 24 06:46:37.199282 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Nov 24 06:46:37.199985 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Nov 24 06:46:37.200402 systemd[1]: Stopped target network-pre.target - Preparation for Network. Nov 24 06:46:37.200537 systemd[1]: systemd-networkd.socket: Deactivated successfully. Nov 24 06:46:37.200557 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Nov 24 06:46:37.201276 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Nov 24 06:46:37.201365 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Nov 24 06:46:37.201390 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Nov 24 06:46:37.201667 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Nov 24 06:46:37.201689 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Nov 24 06:46:37.201896 systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 24 06:46:37.201918 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Nov 24 06:46:37.203322 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Nov 24 06:46:37.203347 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Nov 24 06:46:37.203449 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Nov 24 06:46:37.203470 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Nov 24 06:46:37.204329 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Nov 24 06:46:37.205302 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 24 06:46:37.205337 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Nov 24 06:46:37.214048 systemd[1]: systemd-udevd.service: Deactivated successfully. Nov 24 06:46:37.214781 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Nov 24 06:46:37.215172 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Nov 24 06:46:37.215296 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Nov 24 06:46:37.215608 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Nov 24 06:46:37.215626 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Nov 24 06:46:37.215754 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Nov 24 06:46:37.215777 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Nov 24 06:46:37.215915 systemd[1]: dracut-cmdline.service: Deactivated successfully. Nov 24 06:46:37.215939 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Nov 24 06:46:37.216063 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Nov 24 06:46:37.216087 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Nov 24 06:46:37.217686 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Nov 24 06:46:37.217894 systemd[1]: systemd-network-generator.service: Deactivated successfully. Nov 24 06:46:37.217930 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Nov 24 06:46:37.218411 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Nov 24 06:46:37.218436 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Nov 24 06:46:37.218927 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Nov 24 06:46:37.218957 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:46:37.220327 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Nov 24 06:46:37.220519 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Nov 24 06:46:37.220544 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Nov 24 06:46:37.220704 systemd[1]: network-cleanup.service: Deactivated successfully. Nov 24 06:46:37.221708 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Nov 24 06:46:37.224998 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Nov 24 06:46:37.225190 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Nov 24 06:46:37.246538 systemd[1]: sysroot-boot.service: Deactivated successfully. Nov 24 06:46:37.246607 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Nov 24 06:46:37.246954 systemd[1]: Reached target initrd-switch-root.target - Switch Root. 
Nov 24 06:46:37.247074 systemd[1]: initrd-setup-root.service: Deactivated successfully. Nov 24 06:46:37.247118 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Nov 24 06:46:37.247737 systemd[1]: Starting initrd-switch-root.service - Switch Root... Nov 24 06:46:37.264024 systemd[1]: Switching root. Nov 24 06:46:37.293419 systemd-journald[225]: Journal stopped Nov 24 06:46:38.565283 systemd-journald[225]: Received SIGTERM from PID 1 (systemd). Nov 24 06:46:38.565308 kernel: SELinux: policy capability network_peer_controls=1 Nov 24 06:46:38.565316 kernel: SELinux: policy capability open_perms=1 Nov 24 06:46:38.565322 kernel: SELinux: policy capability extended_socket_class=1 Nov 24 06:46:38.565330 kernel: SELinux: policy capability always_check_network=0 Nov 24 06:46:38.565338 kernel: SELinux: policy capability cgroup_seclabel=1 Nov 24 06:46:38.565345 kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 24 06:46:38.565352 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Nov 24 06:46:38.565358 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Nov 24 06:46:38.565363 kernel: SELinux: policy capability userspace_initial_context=0 Nov 24 06:46:38.565369 kernel: audit: type=1403 audit(1763966797.928:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Nov 24 06:46:38.565375 systemd[1]: Successfully loaded SELinux policy in 58.838ms. Nov 24 06:46:38.565382 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.804ms. Nov 24 06:46:38.565390 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Nov 24 06:46:38.565398 systemd[1]: Detected virtualization vmware. Nov 24 06:46:38.565405 systemd[1]: Detected architecture x86-64. Nov 24 06:46:38.565411 systemd[1]: Detected first boot. Nov 24 06:46:38.565418 systemd[1]: Initializing machine ID from random generator. Nov 24 06:46:38.565425 zram_generator::config[1120]: No configuration found. Nov 24 06:46:38.565513 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Nov 24 06:46:38.565524 kernel: Guest personality initialized and is active Nov 24 06:46:38.565531 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Nov 24 06:46:38.565536 kernel: Initialized host personality Nov 24 06:46:38.565543 kernel: NET: Registered PF_VSOCK protocol family Nov 24 06:46:38.565551 systemd[1]: Populated /etc with preset unit settings. Nov 24 06:46:38.565559 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:46:38.565566 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Nov 24 06:46:38.565573 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Nov 24 06:46:38.565580 systemd[1]: initrd-switch-root.service: Deactivated successfully. Nov 24 06:46:38.565591 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Nov 24 06:46:38.565599 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Nov 24 06:46:38.565608 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. 
Nov 24 06:46:38.565615 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Nov 24 06:46:38.565621 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Nov 24 06:46:38.565628 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Nov 24 06:46:38.565635 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Nov 24 06:46:38.565652 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Nov 24 06:46:38.565660 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Nov 24 06:46:38.565666 systemd[1]: Created slice user.slice - User and Session Slice. Nov 24 06:46:38.565675 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Nov 24 06:46:38.565683 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Nov 24 06:46:38.565690 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Nov 24 06:46:38.565697 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Nov 24 06:46:38.565704 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Nov 24 06:46:38.565711 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Nov 24 06:46:38.565718 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Nov 24 06:46:38.565725 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Nov 24 06:46:38.565733 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Nov 24 06:46:38.565743 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Nov 24 06:46:38.565752 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Nov 24 06:46:38.565760 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Nov 24 06:46:38.565767 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Nov 24 06:46:38.565773 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Nov 24 06:46:38.565780 systemd[1]: Reached target remote-fs.target - Remote File Systems. Nov 24 06:46:38.565787 systemd[1]: Reached target slices.target - Slice Units. Nov 24 06:46:38.565795 systemd[1]: Reached target swap.target - Swaps. Nov 24 06:46:38.565802 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Nov 24 06:46:38.565809 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Nov 24 06:46:38.565817 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Nov 24 06:46:38.565824 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Nov 24 06:46:38.565835 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Nov 24 06:46:38.565843 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Nov 24 06:46:38.565851 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Nov 24 06:46:38.565858 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Nov 24 06:46:38.565865 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Nov 24 06:46:38.565872 systemd[1]: Mounting media.mount - External Media Directory... 
Nov 24 06:46:38.565879 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:38.565886 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Nov 24 06:46:38.565895 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Nov 24 06:46:38.565902 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Nov 24 06:46:38.565909 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Nov 24 06:46:38.565916 systemd[1]: Reached target machines.target - Containers. Nov 24 06:46:38.565923 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Nov 24 06:46:38.565930 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Nov 24 06:46:38.565938 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Nov 24 06:46:38.565945 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Nov 24 06:46:38.565953 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Nov 24 06:46:38.565960 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Nov 24 06:46:38.565968 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Nov 24 06:46:38.565975 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Nov 24 06:46:38.565982 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Nov 24 06:46:38.565989 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Nov 24 06:46:38.565996 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Nov 24 06:46:38.566004 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Nov 24 06:46:38.566015 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Nov 24 06:46:38.566024 systemd[1]: Stopped systemd-fsck-usr.service. Nov 24 06:46:38.566032 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:46:38.566039 systemd[1]: Starting systemd-journald.service - Journal Service... Nov 24 06:46:38.566046 kernel: fuse: init (API version 7.41) Nov 24 06:46:38.566053 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Nov 24 06:46:38.566060 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Nov 24 06:46:38.566067 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Nov 24 06:46:38.566074 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Nov 24 06:46:38.566082 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Nov 24 06:46:38.566089 systemd[1]: verity-setup.service: Deactivated successfully. Nov 24 06:46:38.566099 systemd[1]: Stopped verity-setup.service. Nov 24 06:46:38.566109 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:38.566116 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. 
Nov 24 06:46:38.566123 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Nov 24 06:46:38.566130 systemd[1]: Mounted media.mount - External Media Directory. Nov 24 06:46:38.566137 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Nov 24 06:46:38.566144 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Nov 24 06:46:38.566153 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Nov 24 06:46:38.566161 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Nov 24 06:46:38.566168 systemd[1]: modprobe@configfs.service: Deactivated successfully. Nov 24 06:46:38.566175 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Nov 24 06:46:38.566181 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Nov 24 06:46:38.566189 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Nov 24 06:46:38.566196 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Nov 24 06:46:38.566203 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Nov 24 06:46:38.566211 systemd[1]: modprobe@fuse.service: Deactivated successfully. Nov 24 06:46:38.566218 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Nov 24 06:46:38.566239 systemd-journald[1213]: Collecting audit messages is disabled. Nov 24 06:46:38.566256 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Nov 24 06:46:38.566265 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Nov 24 06:46:38.566274 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Nov 24 06:46:38.566285 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Nov 24 06:46:38.566293 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Nov 24 06:46:38.566303 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Nov 24 06:46:38.566312 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Nov 24 06:46:38.566319 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Nov 24 06:46:38.566326 systemd[1]: Reached target network-pre.target - Preparation for Network. Nov 24 06:46:38.566333 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Nov 24 06:46:38.566342 systemd[1]: Reached target local-fs.target - Local File Systems. Nov 24 06:46:38.566349 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Nov 24 06:46:38.566357 systemd-journald[1213]: Journal started Nov 24 06:46:38.566376 systemd-journald[1213]: Runtime Journal (/run/log/journal/72e3dd5537504383bf84d214fb2b1670) is 4.8M, max 38.5M, 33.7M free. Nov 24 06:46:38.369042 systemd[1]: Queued start job for default target multi-user.target. Nov 24 06:46:38.388988 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Nov 24 06:46:38.569827 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Nov 24 06:46:38.569855 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:46:38.389233 systemd[1]: systemd-journald.service: Deactivated successfully. 
Nov 24 06:46:38.570205 jq[1190]: true Nov 24 06:46:38.570697 jq[1222]: true Nov 24 06:46:38.578230 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Nov 24 06:46:38.578267 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Nov 24 06:46:38.581780 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Nov 24 06:46:38.590650 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Nov 24 06:46:38.591658 systemd[1]: Started systemd-journald.service - Journal Service. Nov 24 06:46:38.592785 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Nov 24 06:46:38.595652 kernel: loop: module loaded Nov 24 06:46:38.606071 systemd[1]: modprobe@loop.service: Deactivated successfully. Nov 24 06:46:38.611931 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Nov 24 06:46:38.612910 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Nov 24 06:46:38.625257 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Nov 24 06:46:38.625742 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Nov 24 06:46:38.632048 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Nov 24 06:46:38.633833 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Nov 24 06:46:38.633970 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Nov 24 06:46:38.638960 systemd[1]: Starting systemd-sysusers.service - Create System Users... Nov 24 06:46:38.639873 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Nov 24 06:46:38.649359 kernel: ACPI: bus type drm_connector registered Nov 24 06:46:38.646852 systemd[1]: modprobe@drm.service: Deactivated successfully. Nov 24 06:46:38.646985 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Nov 24 06:46:38.650140 ignition[1231]: Ignition 2.22.0 Nov 24 06:46:38.650354 ignition[1231]: deleting config from guestinfo properties Nov 24 06:46:38.707961 ignition[1231]: Successfully deleted config Nov 24 06:46:38.714203 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Nov 24 06:46:38.719651 kernel: loop0: detected capacity change from 0 to 110984 Nov 24 06:46:38.724822 systemd-journald[1213]: Time spent on flushing to /var/log/journal/72e3dd5537504383bf84d214fb2b1670 is 36.801ms for 1770 entries. Nov 24 06:46:38.724822 systemd-journald[1213]: System Journal (/var/log/journal/72e3dd5537504383bf84d214fb2b1670) is 8M, max 584.8M, 576.8M free. Nov 24 06:46:38.770180 systemd-journald[1213]: Received client request to flush runtime journal. Nov 24 06:46:38.770209 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Nov 24 06:46:38.730050 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Nov 24 06:46:38.731509 systemd[1]: Finished systemd-sysusers.service - Create System Users. Nov 24 06:46:38.734607 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Nov 24 06:46:38.761097 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Nov 24 06:46:38.769127 systemd-tmpfiles[1283]: ACLs are not supported, ignoring. 
Nov 24 06:46:38.769136 systemd-tmpfiles[1283]: ACLs are not supported, ignoring. Nov 24 06:46:38.770978 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Nov 24 06:46:38.775348 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Nov 24 06:46:38.776835 kernel: loop1: detected capacity change from 0 to 2960 Nov 24 06:46:38.803913 kernel: loop2: detected capacity change from 0 to 224512 Nov 24 06:46:38.842657 kernel: loop3: detected capacity change from 0 to 128560 Nov 24 06:46:38.882656 kernel: loop4: detected capacity change from 0 to 110984 Nov 24 06:46:38.902655 kernel: loop5: detected capacity change from 0 to 2960 Nov 24 06:46:38.920676 kernel: loop6: detected capacity change from 0 to 224512 Nov 24 06:46:38.984666 kernel: loop7: detected capacity change from 0 to 128560 Nov 24 06:46:39.004439 (sd-merge)[1295]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Nov 24 06:46:39.005016 (sd-merge)[1295]: Merged extensions into '/usr'. Nov 24 06:46:39.010368 systemd[1]: Reload requested from client PID 1243 ('systemd-sysext') (unit systemd-sysext.service)... Nov 24 06:46:39.010425 systemd[1]: Reloading... Nov 24 06:46:39.056664 zram_generator::config[1317]: No configuration found. Nov 24 06:46:39.183253 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:46:39.230974 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Nov 24 06:46:39.231133 systemd[1]: Reloading finished in 220 ms. Nov 24 06:46:39.246311 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Nov 24 06:46:39.246999 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Nov 24 06:46:39.258087 systemd[1]: Starting ensure-sysext.service... Nov 24 06:46:39.258963 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Nov 24 06:46:39.261764 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Nov 24 06:46:39.271128 systemd[1]: Reload requested from client PID 1377 ('systemctl') (unit ensure-sysext.service)... Nov 24 06:46:39.271143 systemd[1]: Reloading... Nov 24 06:46:39.277522 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Nov 24 06:46:39.277540 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Nov 24 06:46:39.277726 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Nov 24 06:46:39.277884 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Nov 24 06:46:39.278356 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Nov 24 06:46:39.278516 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Nov 24 06:46:39.278548 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Nov 24 06:46:39.280590 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot. Nov 24 06:46:39.280596 systemd-tmpfiles[1378]: Skipping /boot Nov 24 06:46:39.287041 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot. 
Nov 24 06:46:39.287047 systemd-tmpfiles[1378]: Skipping /boot Nov 24 06:46:39.299817 systemd-udevd[1379]: Using default interface naming scheme 'v255'. Nov 24 06:46:39.317664 zram_generator::config[1402]: No configuration found. Nov 24 06:46:39.344534 ldconfig[1232]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Nov 24 06:46:39.471995 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:46:39.485658 kernel: mousedev: PS/2 mouse device common for all mice Nov 24 06:46:39.489653 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Nov 24 06:46:39.512660 kernel: ACPI: button: Power Button [PWRF] Nov 24 06:46:39.528516 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Nov 24 06:46:39.528813 systemd[1]: Reloading finished in 257 ms. Nov 24 06:46:39.537351 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Nov 24 06:46:39.537726 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Nov 24 06:46:39.541404 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Nov 24 06:46:39.562419 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Nov 24 06:46:39.568028 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.569836 systemd[1]: Starting audit-rules.service - Load Audit Rules... Nov 24 06:46:39.573127 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Nov 24 06:46:39.575099 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Nov 24 06:46:39.578306 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Nov 24 06:46:39.581627 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Nov 24 06:46:39.581839 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:46:39.585065 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Nov 24 06:46:39.585408 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:46:39.588807 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Nov 24 06:46:39.593540 systemd[1]: Starting systemd-networkd.service - Network Configuration... Nov 24 06:46:39.595384 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Nov 24 06:46:39.598803 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Nov 24 06:46:39.598926 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.599978 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Nov 24 06:46:39.600402 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Nov 24 06:46:39.600990 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Nov 24 06:46:39.601356 systemd[1]: modprobe@loop.service: Deactivated successfully. Nov 24 06:46:39.601745 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Nov 24 06:46:39.605434 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.613934 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Nov 24 06:46:39.618210 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Nov 24 06:46:39.618358 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:46:39.618420 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:46:39.618479 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.619024 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Nov 24 06:46:39.619767 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Nov 24 06:46:39.624124 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.628460 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Nov 24 06:46:39.628667 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Nov 24 06:46:39.637723 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Nov 24 06:46:39.638077 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:46:39.638149 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:46:39.640988 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Nov 24 06:46:39.641139 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:46:39.642050 systemd[1]: modprobe@drm.service: Deactivated successfully. Nov 24 06:46:39.642278 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Nov 24 06:46:39.647004 systemd[1]: Finished ensure-sysext.service. Nov 24 06:46:39.652700 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Nov 24 06:46:39.653141 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Nov 24 06:46:39.653399 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Nov 24 06:46:39.653502 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Nov 24 06:46:39.655145 systemd[1]: modprobe@loop.service: Deactivated successfully. Nov 24 06:46:39.655264 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Nov 24 06:46:39.655760 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Nov 24 06:46:39.663942 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Nov 24 06:46:39.667095 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Nov 24 06:46:39.669549 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Nov 24 06:46:39.677374 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Nov 24 06:46:39.679335 systemd[1]: Starting systemd-update-done.service - Update is Completed... Nov 24 06:46:39.696623 augenrules[1548]: No rules Nov 24 06:46:39.698177 systemd[1]: audit-rules.service: Deactivated successfully. Nov 24 06:46:39.698690 systemd[1]: Finished audit-rules.service - Load Audit Rules. Nov 24 06:46:39.700943 systemd[1]: Finished systemd-update-done.service - Update is Completed. Nov 24 06:46:39.710940 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Nov 24 06:46:39.711231 systemd[1]: Started systemd-userdbd.service - User Database Manager. Nov 24 06:46:39.711556 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Nov 24 06:46:39.763038 (udev-worker)[1417]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Nov 24 06:46:39.785929 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:46:39.792380 systemd-networkd[1508]: lo: Link UP Nov 24 06:46:39.793656 systemd-networkd[1508]: lo: Gained carrier Nov 24 06:46:39.794467 systemd-networkd[1508]: Enumeration completed Nov 24 06:46:39.794534 systemd[1]: Started systemd-networkd.service - Network Configuration. Nov 24 06:46:39.794814 systemd-networkd[1508]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Nov 24 06:46:39.797653 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Nov 24 06:46:39.797775 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Nov 24 06:46:39.798011 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Nov 24 06:46:39.799962 systemd-networkd[1508]: ens192: Link UP Nov 24 06:46:39.800048 systemd-networkd[1508]: ens192: Gained carrier Nov 24 06:46:39.801566 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Nov 24 06:46:39.817830 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Nov 24 06:46:39.817990 systemd[1]: Reached target time-set.target - System Time Set. Nov 24 06:46:39.855381 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Nov 24 06:46:39.860632 systemd-resolved[1510]: Positive Trust Anchors: Nov 24 06:46:39.860669 systemd-resolved[1510]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Nov 24 06:46:39.860693 systemd-resolved[1510]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Nov 24 06:46:39.871000 systemd-resolved[1510]: Defaulting to hostname 'linux'. Nov 24 06:46:39.871935 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Nov 24 06:46:39.872093 systemd[1]: Reached target network.target - Network. Nov 24 06:46:39.872184 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Nov 24 06:46:39.899811 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:46:39.900254 systemd[1]: Reached target sysinit.target - System Initialization. Nov 24 06:46:39.900426 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Nov 24 06:46:39.900561 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Nov 24 06:46:39.900692 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Nov 24 06:46:39.900881 systemd[1]: Started logrotate.timer - Daily rotation of log files. Nov 24 06:46:39.901044 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Nov 24 06:46:39.901162 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Nov 24 06:46:39.901277 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Nov 24 06:46:39.901296 systemd[1]: Reached target paths.target - Path Units. Nov 24 06:46:39.901392 systemd[1]: Reached target timers.target - Timer Units. Nov 24 06:46:39.902146 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Nov 24 06:46:39.903127 systemd[1]: Starting docker.socket - Docker Socket for the API... Nov 24 06:46:39.904425 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Nov 24 06:46:39.904611 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Nov 24 06:46:39.904855 systemd[1]: Reached target ssh-access.target - SSH Access Available. Nov 24 06:46:39.906566 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Nov 24 06:46:39.906837 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Nov 24 06:46:39.907297 systemd[1]: Listening on docker.socket - Docker Socket for the API. Nov 24 06:46:39.907805 systemd[1]: Reached target sockets.target - Socket Units. Nov 24 06:46:39.907908 systemd[1]: Reached target basic.target - Basic System. Nov 24 06:46:39.908031 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Nov 24 06:46:39.908048 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Nov 24 06:46:39.908690 systemd[1]: Starting containerd.service - containerd container runtime... 
Nov 24 06:46:39.910699 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Nov 24 06:46:39.912108 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Nov 24 06:46:39.914302 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Nov 24 06:46:39.915062 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Nov 24 06:48:19.971820 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Nov 24 06:48:19.971917 systemd-resolved[1510]: Clock change detected. Flushing caches. Nov 24 06:48:19.974483 systemd-timesyncd[1533]: Contacted time server 172.232.15.202:123 (0.flatcar.pool.ntp.org). Nov 24 06:48:19.974542 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Nov 24 06:48:19.975219 systemd-timesyncd[1533]: Initial clock synchronization to Mon 2025-11-24 06:48:19.971786 UTC. Nov 24 06:48:19.975340 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Nov 24 06:48:19.978017 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Nov 24 06:48:19.978999 jq[1595]: false Nov 24 06:48:19.981798 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Nov 24 06:48:19.983337 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Nov 24 06:48:19.988255 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Refreshing passwd entry cache Nov 24 06:48:19.988157 systemd[1]: Starting systemd-logind.service - User Login Management... Nov 24 06:48:19.988468 oslogin_cache_refresh[1597]: Refreshing passwd entry cache Nov 24 06:48:19.989714 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Nov 24 06:48:19.993490 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Nov 24 06:48:19.993789 systemd[1]: Starting update-engine.service - Update Engine... Nov 24 06:48:19.997625 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Failure getting users, quitting Nov 24 06:48:19.997625 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Nov 24 06:48:19.997625 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Refreshing group entry cache Nov 24 06:48:19.997352 oslogin_cache_refresh[1597]: Failure getting users, quitting Nov 24 06:48:19.997362 oslogin_cache_refresh[1597]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Nov 24 06:48:19.997388 oslogin_cache_refresh[1597]: Refreshing group entry cache Nov 24 06:48:19.997898 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Nov 24 06:48:20.000263 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Failure getting groups, quitting Nov 24 06:48:20.000263 google_oslogin_nss_cache[1597]: oslogin_cache_refresh[1597]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Nov 24 06:48:20.000208 oslogin_cache_refresh[1597]: Failure getting groups, quitting Nov 24 06:48:20.000214 oslogin_cache_refresh[1597]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
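The journal timestamps jump from 06:46:39 to 06:48:19 at this point because systemd-timesyncd applied its initial clock synchronization and systemd-resolved flushed its caches; the boot itself is continuous. A minimal sketch of recovering the applied offset from the two adjacent entries above (timestamps copied verbatim; the year is taken from the containerd entries later in this journal):

from datetime import datetime

# Timestamps copied from the two adjacent entries above: the last one stamped
# before the clock change and the first one stamped after it.
before = datetime.strptime("2025-11-24 06:46:39.915062", "%Y-%m-%d %H:%M:%S.%f")
after = datetime.strptime("2025-11-24 06:48:19.971820", "%Y-%m-%d %H:%M:%S.%f")

# systemd-timesyncd stepped the wall clock forward by roughly this much
# (about 100 seconds) when it applied its initial synchronization.
print((after - before).total_seconds())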
Nov 24 06:48:20.001234 extend-filesystems[1596]: Found /dev/sda6 Nov 24 06:48:20.001236 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Nov 24 06:48:20.005689 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Nov 24 06:48:20.005944 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Nov 24 06:48:20.006072 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Nov 24 06:48:20.006214 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Nov 24 06:48:20.006364 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Nov 24 06:48:20.009581 extend-filesystems[1596]: Found /dev/sda9 Nov 24 06:48:20.010206 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Nov 24 06:48:20.012821 extend-filesystems[1596]: Checking size of /dev/sda9 Nov 24 06:48:20.014714 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Nov 24 06:48:20.015007 systemd[1]: motdgen.service: Deactivated successfully. Nov 24 06:48:20.015131 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Nov 24 06:48:20.023432 jq[1609]: true Nov 24 06:48:20.033608 (ntainerd)[1631]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Nov 24 06:48:20.038325 update_engine[1608]: I20251124 06:48:20.035540 1608 main.cc:92] Flatcar Update Engine starting Nov 24 06:48:20.043662 jq[1633]: true Nov 24 06:48:20.052396 tar[1620]: linux-amd64/LICENSE Nov 24 06:48:20.052396 tar[1620]: linux-amd64/helm Nov 24 06:48:20.052989 extend-filesystems[1596]: Old size kept for /dev/sda9 Nov 24 06:48:20.053521 systemd[1]: extend-filesystems.service: Deactivated successfully. Nov 24 06:48:20.053686 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Nov 24 06:48:20.059930 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Nov 24 06:48:20.063311 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Nov 24 06:48:20.071793 dbus-daemon[1593]: [system] SELinux support is enabled Nov 24 06:48:20.071877 systemd[1]: Started dbus.service - D-Bus System Message Bus. Nov 24 06:48:20.074880 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Nov 24 06:48:20.074896 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Nov 24 06:48:20.075020 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Nov 24 06:48:20.075029 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Nov 24 06:48:20.086617 systemd[1]: Started update-engine.service - Update Engine. Nov 24 06:48:20.087687 update_engine[1608]: I20251124 06:48:20.087656 1608 update_check_scheduler.cc:74] Next update check in 3m9s Nov 24 06:48:20.100113 systemd[1]: Started locksmithd.service - Cluster reboot manager. Nov 24 06:48:20.108116 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. 
Nov 24 06:48:20.117978 systemd-logind[1602]: Watching system buttons on /dev/input/event2 (Power Button) Nov 24 06:48:20.117992 systemd-logind[1602]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Nov 24 06:48:20.118139 systemd-logind[1602]: New seat seat0. Nov 24 06:48:20.119924 systemd[1]: Started systemd-logind.service - User Login Management. Nov 24 06:48:20.128236 unknown[1643]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Nov 24 06:48:20.130519 unknown[1643]: Core dump limit set to -1 Nov 24 06:48:20.151726 bash[1663]: Updated "/home/core/.ssh/authorized_keys" Nov 24 06:48:20.152622 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Nov 24 06:48:20.153193 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Nov 24 06:48:20.247492 locksmithd[1651]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Nov 24 06:48:20.272793 sshd_keygen[1619]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Nov 24 06:48:20.278123 containerd[1631]: time="2025-11-24T06:48:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Nov 24 06:48:20.279522 containerd[1631]: time="2025-11-24T06:48:20.279457410Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300290373Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.289µs" Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300313890Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300325487Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300415568Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300424473Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300438936Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300471940Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300478812Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300613058Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300624895Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper 
type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300630950Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301049 containerd[1631]: time="2025-11-24T06:48:20.300635790Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300676787Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300786362Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300801998Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300807608Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300821361Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300939994Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Nov 24 06:48:20.301284 containerd[1631]: time="2025-11-24T06:48:20.300969767Z" level=info msg="metadata content store policy set" policy=shared Nov 24 06:48:20.306523 containerd[1631]: time="2025-11-24T06:48:20.306504239Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Nov 24 06:48:20.306868 containerd[1631]: time="2025-11-24T06:48:20.306853868Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Nov 24 06:48:20.307006 containerd[1631]: time="2025-11-24T06:48:20.306998105Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Nov 24 06:48:20.307043 containerd[1631]: time="2025-11-24T06:48:20.307036121Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Nov 24 06:48:20.307082 containerd[1631]: time="2025-11-24T06:48:20.307074224Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Nov 24 06:48:20.307112 containerd[1631]: time="2025-11-24T06:48:20.307106212Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Nov 24 06:48:20.307148 containerd[1631]: time="2025-11-24T06:48:20.307140839Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Nov 24 06:48:20.307216 containerd[1631]: time="2025-11-24T06:48:20.307208227Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Nov 24 06:48:20.307321 containerd[1631]: time="2025-11-24T06:48:20.307313210Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Nov 24 06:48:20.307354 containerd[1631]: time="2025-11-24T06:48:20.307347613Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Nov 24 06:48:20.307413 containerd[1631]: time="2025-11-24T06:48:20.307405518Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Nov 24 06:48:20.307443 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Nov 24 06:48:20.307510 containerd[1631]: time="2025-11-24T06:48:20.307501583Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Nov 24 06:48:20.307947 containerd[1631]: time="2025-11-24T06:48:20.307936442Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Nov 24 06:48:20.308124 containerd[1631]: time="2025-11-24T06:48:20.308115750Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Nov 24 06:48:20.308166 containerd[1631]: time="2025-11-24T06:48:20.308158457Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Nov 24 06:48:20.308289 containerd[1631]: time="2025-11-24T06:48:20.308279107Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Nov 24 06:48:20.308337 containerd[1631]: time="2025-11-24T06:48:20.308326567Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Nov 24 06:48:20.308456 containerd[1631]: time="2025-11-24T06:48:20.308448735Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Nov 24 06:48:20.308491 containerd[1631]: time="2025-11-24T06:48:20.308484335Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Nov 24 06:48:20.308528 containerd[1631]: time="2025-11-24T06:48:20.308520528Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308556860Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308566545Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308575275Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308605515Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308613454Z" level=info msg="Start snapshots syncer" Nov 24 06:48:20.309670 containerd[1631]: time="2025-11-24T06:48:20.308629123Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Nov 24 06:48:20.309122 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Nov 24 06:48:20.309794 containerd[1631]: time="2025-11-24T06:48:20.308786240Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Nov 24 06:48:20.309794 containerd[1631]: time="2025-11-24T06:48:20.308817494Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308842085Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308894443Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308906523Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308912348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308918108Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308926642Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308932949Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308938587Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308951184Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308957221Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308966973Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308982744Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308991412Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Nov 24 06:48:20.309872 containerd[1631]: time="2025-11-24T06:48:20.308996175Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309001399Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309005536Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309010372Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309018293Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309027536Z" level=info msg="runtime interface created" Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309030554Z" level=info msg="created NRI interface" Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309034818Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309040390Z" level=info msg="Connect containerd service" Nov 24 06:48:20.310046 containerd[1631]: time="2025-11-24T06:48:20.309051448Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Nov 24 06:48:20.311542 containerd[1631]: time="2025-11-24T06:48:20.311214278Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Nov 24 06:48:20.338361 systemd[1]: issuegen.service: Deactivated successfully. Nov 24 06:48:20.338512 systemd[1]: Finished issuegen.service - Generate /run/issue. Nov 24 06:48:20.341013 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Nov 24 06:48:20.368031 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Nov 24 06:48:20.370506 systemd[1]: Started getty@tty1.service - Getty on tty1. Nov 24 06:48:20.371858 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Nov 24 06:48:20.373340 systemd[1]: Reached target getty.target - Login Prompts. 
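The containerd error a few entries above about /etc/cni/net.d is expected at this stage: the directory stays empty until a CNI plugin (for example, a Kubernetes network add-on) installs its configuration, and the CRI plugin's conf syncer picks it up later. A small sketch, under that assumption, for checking whether any CNI configuration is present yet:

import glob
import os

# containerd reported "no network config found in /etc/cni/net.d". This just
# lists whatever *.conf / *.conflist files are present there at the moment.
cni_dir = "/etc/cni/net.d"
configs = sorted(glob.glob(os.path.join(cni_dir, "*.conf*")))
print(configs if configs else f"no CNI config in {cni_dir} yet; containerd's conf syncer will pick it up later")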
Nov 24 06:48:20.426189 containerd[1631]: time="2025-11-24T06:48:20.426158702Z" level=info msg="Start subscribing containerd event" Nov 24 06:48:20.426836 containerd[1631]: time="2025-11-24T06:48:20.426817843Z" level=info msg="Start recovering state" Nov 24 06:48:20.427014 containerd[1631]: time="2025-11-24T06:48:20.427000045Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Nov 24 06:48:20.427085 containerd[1631]: time="2025-11-24T06:48:20.427076865Z" level=info msg=serving... address=/run/containerd/containerd.sock Nov 24 06:48:20.427135 containerd[1631]: time="2025-11-24T06:48:20.427040858Z" level=info msg="Start event monitor" Nov 24 06:48:20.427172 containerd[1631]: time="2025-11-24T06:48:20.427163494Z" level=info msg="Start cni network conf syncer for default" Nov 24 06:48:20.427237 containerd[1631]: time="2025-11-24T06:48:20.427216291Z" level=info msg="Start streaming server" Nov 24 06:48:20.427280 containerd[1631]: time="2025-11-24T06:48:20.427273903Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Nov 24 06:48:20.427321 containerd[1631]: time="2025-11-24T06:48:20.427311597Z" level=info msg="runtime interface starting up..." Nov 24 06:48:20.427348 containerd[1631]: time="2025-11-24T06:48:20.427343156Z" level=info msg="starting plugins..." Nov 24 06:48:20.427460 containerd[1631]: time="2025-11-24T06:48:20.427375217Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Nov 24 06:48:20.427490 containerd[1631]: time="2025-11-24T06:48:20.427449685Z" level=info msg="containerd successfully booted in 0.149540s" Nov 24 06:48:20.427512 systemd[1]: Started containerd.service - containerd container runtime. Nov 24 06:48:20.471273 tar[1620]: linux-amd64/README.md Nov 24 06:48:20.478658 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Nov 24 06:48:21.836341 systemd-networkd[1508]: ens192: Gained IPv6LL Nov 24 06:48:21.838355 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Nov 24 06:48:21.838903 systemd[1]: Reached target network-online.target - Network is Online. Nov 24 06:48:21.839921 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Nov 24 06:48:21.841105 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:48:21.842548 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Nov 24 06:48:21.865511 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Nov 24 06:48:21.880168 systemd[1]: coreos-metadata.service: Deactivated successfully. Nov 24 06:48:21.880319 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Nov 24 06:48:21.880990 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Nov 24 06:48:22.669359 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:48:22.669679 systemd[1]: Reached target multi-user.target - Multi-User System. Nov 24 06:48:22.670167 systemd[1]: Startup finished in 2.727s (kernel) + 5.329s (initrd) + 4.741s (userspace) = 12.798s. 
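The "Startup finished" entry above reports the three boot phases and their total; summing the printed values gives 12.797 s rather than the reported 12.798 s, presumably because each phase is rounded to milliseconds before printing. A sketch that parses that entry (the line is copied from the journal above):

import re

# The "Startup finished" entry, copied from the journal above.
line = ("Startup finished in 2.727s (kernel) + 5.329s (initrd) "
        "+ 4.741s (userspace) = 12.798s.")

# Each phase duration keyed by phase name, plus the reported total.
phases = {name: float(secs)
          for secs, name in re.findall(r"([\d.]+)s \((\w+)\)", line)}
total = float(re.search(r"= ([\d.]+)s", line).group(1))

# The phases sum to 12.797 s; the 1 ms gap to the reported 12.798 s is just
# rounding of the individual phase values.
print(phases, round(sum(phases.values()), 3), total)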
Nov 24 06:48:22.677298 (kubelet)[1791]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:48:22.701178 login[1730]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Nov 24 06:48:22.702182 login[1731]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Nov 24 06:48:22.706992 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Nov 24 06:48:22.708156 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Nov 24 06:48:22.715938 systemd-logind[1602]: New session 1 of user core. Nov 24 06:48:22.720035 systemd-logind[1602]: New session 2 of user core. Nov 24 06:48:22.725036 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Nov 24 06:48:22.728360 systemd[1]: Starting user@500.service - User Manager for UID 500... Nov 24 06:48:22.737670 (systemd)[1798]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Nov 24 06:48:22.739624 systemd-logind[1602]: New session c1 of user core. Nov 24 06:48:22.829159 systemd[1798]: Queued start job for default target default.target. Nov 24 06:48:22.833993 systemd[1798]: Created slice app.slice - User Application Slice. Nov 24 06:48:22.834015 systemd[1798]: Reached target paths.target - Paths. Nov 24 06:48:22.834042 systemd[1798]: Reached target timers.target - Timers. Nov 24 06:48:22.834716 systemd[1798]: Starting dbus.socket - D-Bus User Message Bus Socket... Nov 24 06:48:22.845564 systemd[1798]: Listening on dbus.socket - D-Bus User Message Bus Socket. Nov 24 06:48:22.845652 systemd[1798]: Reached target sockets.target - Sockets. Nov 24 06:48:22.845678 systemd[1798]: Reached target basic.target - Basic System. Nov 24 06:48:22.845698 systemd[1798]: Reached target default.target - Main User Target. Nov 24 06:48:22.845714 systemd[1798]: Startup finished in 102ms. Nov 24 06:48:22.845852 systemd[1]: Started user@500.service - User Manager for UID 500. Nov 24 06:48:22.852601 systemd[1]: Started session-1.scope - Session 1 of User core. Nov 24 06:48:22.853196 systemd[1]: Started session-2.scope - Session 2 of User core. Nov 24 06:48:23.202175 kubelet[1791]: E1124 06:48:23.202137 1791 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:48:23.203719 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:48:23.203806 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:48:23.204190 systemd[1]: kubelet.service: Consumed 608ms CPU time, 264.2M memory peak. Nov 24 06:48:33.235127 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Nov 24 06:48:33.236462 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:48:33.582095 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Nov 24 06:48:33.594475 (kubelet)[1841]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:48:33.679167 kubelet[1841]: E1124 06:48:33.679133 1841 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:48:33.681620 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:48:33.681704 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:48:33.681893 systemd[1]: kubelet.service: Consumed 117ms CPU time, 111.2M memory peak. Nov 24 06:48:43.734935 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Nov 24 06:48:43.737316 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:48:44.103106 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:48:44.110553 (kubelet)[1856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:48:44.149554 kubelet[1856]: E1124 06:48:44.149520 1856 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:48:44.151071 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:48:44.151218 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:48:44.151622 systemd[1]: kubelet.service: Consumed 107ms CPU time, 109.1M memory peak. Nov 24 06:48:50.270059 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Nov 24 06:48:50.271260 systemd[1]: Started sshd@0-139.178.70.102:22-147.75.109.163:38760.service - OpenSSH per-connection server daemon (147.75.109.163:38760). Nov 24 06:48:50.354580 sshd[1864]: Accepted publickey for core from 147.75.109.163 port 38760 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:50.355451 sshd-session[1864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:50.358427 systemd-logind[1602]: New session 3 of user core. Nov 24 06:48:50.374541 systemd[1]: Started session-3.scope - Session 3 of User core. Nov 24 06:48:50.428364 systemd[1]: Started sshd@1-139.178.70.102:22-147.75.109.163:38762.service - OpenSSH per-connection server daemon (147.75.109.163:38762). Nov 24 06:48:50.469606 sshd[1870]: Accepted publickey for core from 147.75.109.163 port 38762 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:50.470804 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:50.474777 systemd-logind[1602]: New session 4 of user core. Nov 24 06:48:50.480310 systemd[1]: Started session-4.scope - Session 4 of User core. Nov 24 06:48:50.529291 sshd[1873]: Connection closed by 147.75.109.163 port 38762 Nov 24 06:48:50.530241 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Nov 24 06:48:50.536870 systemd[1]: sshd@1-139.178.70.102:22-147.75.109.163:38762.service: Deactivated successfully. 
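The kubelet entries above show the unit exiting immediately on each start because /var/lib/kubelet/config.yaml does not exist yet (it is typically written later, for example by kubeadm), so systemd keeps scheduling restarts with an increasing counter, as seen again further down. A minimal sketch that tallies these failures from a plain-text journal export (the journal.txt file name is only illustrative):

import re

# Tally kubelet failures caused by the missing /var/lib/kubelet/config.yaml
# in a plain-text journal export; "journal.txt" is only an illustrative name.
config_error = re.compile(r"failed to load Kubelet config file /var/lib/kubelet/config\.yaml")
restart = re.compile(r"kubelet\.service: Scheduled restart job, restart counter is at (\d+)")

failures = 0
last_counter = 0
with open("journal.txt", encoding="utf-8") as journal:
    for entry in journal:
        if config_error.search(entry):
            failures += 1
        match = restart.search(entry)
        if match:
            last_counter = int(match.group(1))

print(f"{failures} config-load failures, restart counter last seen at {last_counter}")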
Nov 24 06:48:50.538003 systemd[1]: session-4.scope: Deactivated successfully. Nov 24 06:48:50.539040 systemd-logind[1602]: Session 4 logged out. Waiting for processes to exit. Nov 24 06:48:50.540807 systemd[1]: Started sshd@2-139.178.70.102:22-147.75.109.163:38778.service - OpenSSH per-connection server daemon (147.75.109.163:38778). Nov 24 06:48:50.543497 systemd-logind[1602]: Removed session 4. Nov 24 06:48:50.581877 sshd[1879]: Accepted publickey for core from 147.75.109.163 port 38778 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:50.582782 sshd-session[1879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:50.586605 systemd-logind[1602]: New session 5 of user core. Nov 24 06:48:50.595327 systemd[1]: Started session-5.scope - Session 5 of User core. Nov 24 06:48:50.642241 sshd[1882]: Connection closed by 147.75.109.163 port 38778 Nov 24 06:48:50.642568 sshd-session[1879]: pam_unix(sshd:session): session closed for user core Nov 24 06:48:50.655295 systemd[1]: sshd@2-139.178.70.102:22-147.75.109.163:38778.service: Deactivated successfully. Nov 24 06:48:50.656835 systemd[1]: session-5.scope: Deactivated successfully. Nov 24 06:48:50.658057 systemd-logind[1602]: Session 5 logged out. Waiting for processes to exit. Nov 24 06:48:50.659396 systemd[1]: Started sshd@3-139.178.70.102:22-147.75.109.163:47326.service - OpenSSH per-connection server daemon (147.75.109.163:47326). Nov 24 06:48:50.661463 systemd-logind[1602]: Removed session 5. Nov 24 06:48:50.705294 sshd[1888]: Accepted publickey for core from 147.75.109.163 port 47326 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:50.706484 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:50.711048 systemd-logind[1602]: New session 6 of user core. Nov 24 06:48:50.720338 systemd[1]: Started session-6.scope - Session 6 of User core. Nov 24 06:48:50.768530 sshd[1891]: Connection closed by 147.75.109.163 port 47326 Nov 24 06:48:50.768865 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Nov 24 06:48:50.781888 systemd[1]: sshd@3-139.178.70.102:22-147.75.109.163:47326.service: Deactivated successfully. Nov 24 06:48:50.782912 systemd[1]: session-6.scope: Deactivated successfully. Nov 24 06:48:50.784148 systemd-logind[1602]: Session 6 logged out. Waiting for processes to exit. Nov 24 06:48:50.785656 systemd[1]: Started sshd@4-139.178.70.102:22-147.75.109.163:47340.service - OpenSSH per-connection server daemon (147.75.109.163:47340). Nov 24 06:48:50.787455 systemd-logind[1602]: Removed session 6. Nov 24 06:48:50.826046 sshd[1897]: Accepted publickey for core from 147.75.109.163 port 47340 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:50.826675 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:50.829272 systemd-logind[1602]: New session 7 of user core. Nov 24 06:48:50.836318 systemd[1]: Started session-7.scope - Session 7 of User core. 
Nov 24 06:48:50.949608 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Nov 24 06:48:50.949766 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:48:50.966480 sudo[1901]: pam_unix(sudo:session): session closed for user root Nov 24 06:48:50.967240 sshd[1900]: Connection closed by 147.75.109.163 port 47340 Nov 24 06:48:50.967535 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Nov 24 06:48:50.980394 systemd[1]: sshd@4-139.178.70.102:22-147.75.109.163:47340.service: Deactivated successfully. Nov 24 06:48:50.981268 systemd[1]: session-7.scope: Deactivated successfully. Nov 24 06:48:50.981696 systemd-logind[1602]: Session 7 logged out. Waiting for processes to exit. Nov 24 06:48:50.982962 systemd[1]: Started sshd@5-139.178.70.102:22-147.75.109.163:47342.service - OpenSSH per-connection server daemon (147.75.109.163:47342). Nov 24 06:48:50.984606 systemd-logind[1602]: Removed session 7. Nov 24 06:48:51.023045 sshd[1907]: Accepted publickey for core from 147.75.109.163 port 47342 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:51.023821 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:51.026305 systemd-logind[1602]: New session 8 of user core. Nov 24 06:48:51.036355 systemd[1]: Started session-8.scope - Session 8 of User core. Nov 24 06:48:51.084381 sudo[1912]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Nov 24 06:48:51.084597 sudo[1912]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:48:51.086943 sudo[1912]: pam_unix(sudo:session): session closed for user root Nov 24 06:48:51.090040 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Nov 24 06:48:51.090182 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:48:51.096394 systemd[1]: Starting audit-rules.service - Load Audit Rules... Nov 24 06:48:51.120904 augenrules[1934]: No rules Nov 24 06:48:51.121479 systemd[1]: audit-rules.service: Deactivated successfully. Nov 24 06:48:51.121699 systemd[1]: Finished audit-rules.service - Load Audit Rules. Nov 24 06:48:51.122439 sudo[1911]: pam_unix(sudo:session): session closed for user root Nov 24 06:48:51.124084 sshd[1910]: Connection closed by 147.75.109.163 port 47342 Nov 24 06:48:51.123240 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Nov 24 06:48:51.128893 systemd[1]: sshd@5-139.178.70.102:22-147.75.109.163:47342.service: Deactivated successfully. Nov 24 06:48:51.130049 systemd[1]: session-8.scope: Deactivated successfully. Nov 24 06:48:51.131369 systemd-logind[1602]: Session 8 logged out. Waiting for processes to exit. Nov 24 06:48:51.133004 systemd[1]: Started sshd@6-139.178.70.102:22-147.75.109.163:47344.service - OpenSSH per-connection server daemon (147.75.109.163:47344). Nov 24 06:48:51.133605 systemd-logind[1602]: Removed session 8. Nov 24 06:48:51.173047 sshd[1943]: Accepted publickey for core from 147.75.109.163 port 47344 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:48:51.174039 sshd-session[1943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:48:51.176516 systemd-logind[1602]: New session 9 of user core. Nov 24 06:48:51.188519 systemd[1]: Started session-9.scope - Session 9 of User core. 
Nov 24 06:48:51.236324 sudo[1947]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Nov 24 06:48:51.236681 sudo[1947]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:48:51.527762 systemd[1]: Starting docker.service - Docker Application Container Engine... Nov 24 06:48:51.537411 (dockerd)[1965]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Nov 24 06:48:51.744715 dockerd[1965]: time="2025-11-24T06:48:51.744547393Z" level=info msg="Starting up" Nov 24 06:48:51.745437 dockerd[1965]: time="2025-11-24T06:48:51.745344607Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Nov 24 06:48:51.751681 dockerd[1965]: time="2025-11-24T06:48:51.751657404Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Nov 24 06:48:51.762055 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2194953607-merged.mount: Deactivated successfully. Nov 24 06:48:51.776608 dockerd[1965]: time="2025-11-24T06:48:51.776574441Z" level=info msg="Loading containers: start." Nov 24 06:48:51.783290 kernel: Initializing XFRM netlink socket Nov 24 06:48:51.940015 systemd-networkd[1508]: docker0: Link UP Nov 24 06:48:51.941436 dockerd[1965]: time="2025-11-24T06:48:51.941397424Z" level=info msg="Loading containers: done." Nov 24 06:48:51.949393 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4054089417-merged.mount: Deactivated successfully. Nov 24 06:48:51.951931 dockerd[1965]: time="2025-11-24T06:48:51.951747377Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Nov 24 06:48:51.951931 dockerd[1965]: time="2025-11-24T06:48:51.951795614Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Nov 24 06:48:51.951931 dockerd[1965]: time="2025-11-24T06:48:51.951835413Z" level=info msg="Initializing buildkit" Nov 24 06:48:51.961127 dockerd[1965]: time="2025-11-24T06:48:51.961111421Z" level=info msg="Completed buildkit initialization" Nov 24 06:48:51.966288 dockerd[1965]: time="2025-11-24T06:48:51.966275243Z" level=info msg="Daemon has completed initialization" Nov 24 06:48:51.966416 systemd[1]: Started docker.service - Docker Application Container Engine. Nov 24 06:48:51.966740 dockerd[1965]: time="2025-11-24T06:48:51.966718274Z" level=info msg="API listen on /run/docker.sock" Nov 24 06:48:52.604619 containerd[1631]: time="2025-11-24T06:48:52.604564422Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Nov 24 06:48:53.586738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount537617434.mount: Deactivated successfully. Nov 24 06:48:54.235161 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Nov 24 06:48:54.236731 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:48:54.477381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Nov 24 06:48:54.483435 (kubelet)[2242]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:48:54.532195 kubelet[2242]: E1124 06:48:54.532078 2242 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:48:54.534644 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:48:54.534747 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:48:54.535305 systemd[1]: kubelet.service: Consumed 100ms CPU time, 107.6M memory peak. Nov 24 06:48:54.826281 containerd[1631]: time="2025-11-24T06:48:54.825848526Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:54.830774 containerd[1631]: time="2025-11-24T06:48:54.830756624Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=29072183" Nov 24 06:48:54.836862 containerd[1631]: time="2025-11-24T06:48:54.836842617Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:54.848272 containerd[1631]: time="2025-11-24T06:48:54.848247582Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:54.848755 containerd[1631]: time="2025-11-24T06:48:54.848739554Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 2.24415322s" Nov 24 06:48:54.848811 containerd[1631]: time="2025-11-24T06:48:54.848802364Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\"" Nov 24 06:48:54.849123 containerd[1631]: time="2025-11-24T06:48:54.849092298Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Nov 24 06:48:56.724566 containerd[1631]: time="2025-11-24T06:48:56.724536330Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:56.729819 containerd[1631]: time="2025-11-24T06:48:56.729805463Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24992010" Nov 24 06:48:56.734516 containerd[1631]: time="2025-11-24T06:48:56.734489403Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:56.741827 containerd[1631]: time="2025-11-24T06:48:56.741812847Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:56.742983 containerd[1631]: time="2025-11-24T06:48:56.742968438Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 1.893777061s" Nov 24 06:48:56.743047 containerd[1631]: time="2025-11-24T06:48:56.743036837Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\"" Nov 24 06:48:56.743518 containerd[1631]: time="2025-11-24T06:48:56.743418961Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Nov 24 06:48:57.919632 containerd[1631]: time="2025-11-24T06:48:57.919189311Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:57.920117 containerd[1631]: time="2025-11-24T06:48:57.920106701Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19404248" Nov 24 06:48:57.920396 containerd[1631]: time="2025-11-24T06:48:57.920384338Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:57.922085 containerd[1631]: time="2025-11-24T06:48:57.922074431Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:57.922471 containerd[1631]: time="2025-11-24T06:48:57.922387646Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 1.178909083s" Nov 24 06:48:57.922851 containerd[1631]: time="2025-11-24T06:48:57.922841907Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\"" Nov 24 06:48:57.923135 containerd[1631]: time="2025-11-24T06:48:57.923122173Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Nov 24 06:48:59.070540 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2481023857.mount: Deactivated successfully. 
Nov 24 06:48:59.427819 containerd[1631]: time="2025-11-24T06:48:59.427746147Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:59.432291 containerd[1631]: time="2025-11-24T06:48:59.432254927Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31161423" Nov 24 06:48:59.440095 containerd[1631]: time="2025-11-24T06:48:59.440059070Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:59.445011 containerd[1631]: time="2025-11-24T06:48:59.444983740Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:48:59.445328 containerd[1631]: time="2025-11-24T06:48:59.445313707Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 1.522124967s" Nov 24 06:48:59.445378 containerd[1631]: time="2025-11-24T06:48:59.445369518Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\"" Nov 24 06:48:59.445653 containerd[1631]: time="2025-11-24T06:48:59.445643303Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Nov 24 06:49:00.072359 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1055535536.mount: Deactivated successfully. 
Nov 24 06:49:00.797756 containerd[1631]: time="2025-11-24T06:49:00.797212702Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:00.801795 containerd[1631]: time="2025-11-24T06:49:00.801759771Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Nov 24 06:49:00.807196 containerd[1631]: time="2025-11-24T06:49:00.807172267Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:00.811922 containerd[1631]: time="2025-11-24T06:49:00.811907705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:00.812380 containerd[1631]: time="2025-11-24T06:49:00.812368526Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.366677612s" Nov 24 06:49:00.812429 containerd[1631]: time="2025-11-24T06:49:00.812422045Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Nov 24 06:49:00.812964 containerd[1631]: time="2025-11-24T06:49:00.812887569Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Nov 24 06:49:01.350989 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3824318313.mount: Deactivated successfully. 
Nov 24 06:49:01.353887 containerd[1631]: time="2025-11-24T06:49:01.353425204Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Nov 24 06:49:01.353887 containerd[1631]: time="2025-11-24T06:49:01.353783667Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Nov 24 06:49:01.353887 containerd[1631]: time="2025-11-24T06:49:01.353865423Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Nov 24 06:49:01.354878 containerd[1631]: time="2025-11-24T06:49:01.354866302Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Nov 24 06:49:01.355262 containerd[1631]: time="2025-11-24T06:49:01.355245960Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 542.280091ms" Nov 24 06:49:01.355292 containerd[1631]: time="2025-11-24T06:49:01.355262607Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Nov 24 06:49:01.355587 containerd[1631]: time="2025-11-24T06:49:01.355576230Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Nov 24 06:49:02.004822 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount776065753.mount: Deactivated successfully. 
Nov 24 06:49:03.795175 containerd[1631]: time="2025-11-24T06:49:03.795130264Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:03.800218 containerd[1631]: time="2025-11-24T06:49:03.800178823Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Nov 24 06:49:03.805347 containerd[1631]: time="2025-11-24T06:49:03.805312928Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:03.807240 containerd[1631]: time="2025-11-24T06:49:03.807206812Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:03.807829 containerd[1631]: time="2025-11-24T06:49:03.807717963Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.451947565s" Nov 24 06:49:03.807829 containerd[1631]: time="2025-11-24T06:49:03.807738955Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Nov 24 06:49:04.735715 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Nov 24 06:49:04.739335 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:49:04.886302 update_engine[1608]: I20251124 06:49:04.885835 1608 update_attempter.cc:509] Updating boot flags... Nov 24 06:49:05.280316 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:05.283462 (kubelet)[2423]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:49:05.381790 kubelet[2423]: E1124 06:49:05.381753 2423 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:49:05.383269 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:49:05.383352 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:49:05.383680 systemd[1]: kubelet.service: Consumed 101ms CPU time, 107.7M memory peak. Nov 24 06:49:06.100330 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:06.100636 systemd[1]: kubelet.service: Consumed 101ms CPU time, 107.7M memory peak. Nov 24 06:49:06.102574 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:49:06.122313 systemd[1]: Reload requested from client PID 2437 ('systemctl') (unit session-9.scope)... Nov 24 06:49:06.122322 systemd[1]: Reloading... Nov 24 06:49:06.188122 zram_generator::config[2483]: No configuration found. 
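The "Pulled image ..." entries above record, for each control-plane image (kube-controller-manager, kube-scheduler, kube-proxy, coredns, pause, etcd), its compressed size in bytes and the wall-clock pull time. A minimal sketch, assuming this console log has been saved to a file named console.log (the script and the filename are illustrative, not part of the capture), that extracts those figures and estimates per-image pull throughput:

#!/usr/bin/env python3
"""Summarize containerd 'Pulled image' entries: size, duration, approximate throughput."""
import re
import sys

# Matches e.g.: Pulled image \"registry.k8s.io/etcd:3.5.16-0\" ... size \"57680541\" in 2.451947565s
# (the backslash-escaped quotes are literally present in containerd's logfmt output).
PULLED = re.compile(
    r'Pulled image \\"(?P<image>[^\\]+)\\".*?'
    r'size \\"(?P<size>\d+)\\" in (?P<dur>[0-9.]+)(?P<unit>ms|s)'
)

def main(path: str) -> None:
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = PULLED.search(line)
            if not m:
                continue
            mib = int(m["size"]) / (1024 * 1024)
            secs = float(m["dur"]) / (1000.0 if m["unit"] == "ms" else 1.0)
            print(f'{m["image"]:<50} {mib:7.1f} MiB in {secs:6.2f}s  ~{mib / secs:.1f} MiB/s')

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "console.log")

For example, the etcd pull above (57680541 bytes in about 2.45 s) works out to roughly 22 MiB/s.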
Nov 24 06:49:06.261087 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:49:06.328290 systemd[1]: Reloading finished in 205 ms. Nov 24 06:49:06.525372 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Nov 24 06:49:06.525439 systemd[1]: kubelet.service: Failed with result 'signal'. Nov 24 06:49:06.525619 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:06.527194 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:49:07.153520 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:07.160478 (kubelet)[2548]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Nov 24 06:49:07.222795 kubelet[2548]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 06:49:07.222795 kubelet[2548]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Nov 24 06:49:07.222795 kubelet[2548]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 06:49:07.222795 kubelet[2548]: I1124 06:49:07.222189 2548 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 24 06:49:07.476868 kubelet[2548]: I1124 06:49:07.476554 2548 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Nov 24 06:49:07.476868 kubelet[2548]: I1124 06:49:07.476580 2548 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 24 06:49:07.476993 kubelet[2548]: I1124 06:49:07.476883 2548 server.go:954] "Client rotation is on, will bootstrap in background" Nov 24 06:49:07.561784 kubelet[2548]: E1124 06:49:07.561603 2548 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:07.563319 kubelet[2548]: I1124 06:49:07.563299 2548 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Nov 24 06:49:07.571708 kubelet[2548]: I1124 06:49:07.571686 2548 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 24 06:49:07.577009 kubelet[2548]: I1124 06:49:07.576930 2548 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Nov 24 06:49:07.579440 kubelet[2548]: I1124 06:49:07.579174 2548 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 24 06:49:07.579440 kubelet[2548]: I1124 06:49:07.579200 2548 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 24 06:49:07.580980 kubelet[2548]: I1124 06:49:07.580967 2548 topology_manager.go:138] "Creating topology manager with none policy" Nov 24 06:49:07.581054 kubelet[2548]: I1124 06:49:07.581045 2548 container_manager_linux.go:304] "Creating device plugin manager" Nov 24 06:49:07.582154 kubelet[2548]: I1124 06:49:07.582144 2548 state_mem.go:36] "Initialized new in-memory state store" Nov 24 06:49:07.587994 kubelet[2548]: I1124 06:49:07.587847 2548 kubelet.go:446] "Attempting to sync node with API server" Nov 24 06:49:07.587994 kubelet[2548]: I1124 06:49:07.587883 2548 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 24 06:49:07.589394 kubelet[2548]: I1124 06:49:07.589146 2548 kubelet.go:352] "Adding apiserver pod source" Nov 24 06:49:07.589394 kubelet[2548]: I1124 06:49:07.589187 2548 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 24 06:49:07.592377 kubelet[2548]: W1124 06:49:07.592206 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:07.592597 kubelet[2548]: E1124 06:49:07.592573 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:07.593889 kubelet[2548]: I1124 06:49:07.593772 2548 
kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Nov 24 06:49:07.608597 kubelet[2548]: I1124 06:49:07.608573 2548 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 24 06:49:07.616272 kubelet[2548]: W1124 06:49:07.615482 2548 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Nov 24 06:49:07.616272 kubelet[2548]: I1124 06:49:07.616053 2548 watchdog_linux.go:99] "Systemd watchdog is not enabled" Nov 24 06:49:07.616272 kubelet[2548]: I1124 06:49:07.616084 2548 server.go:1287] "Started kubelet" Nov 24 06:49:07.634392 kubelet[2548]: W1124 06:49:07.634351 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:07.634473 kubelet[2548]: E1124 06:49:07.634394 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:07.648018 kubelet[2548]: I1124 06:49:07.647882 2548 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Nov 24 06:49:07.669979 kubelet[2548]: I1124 06:49:07.669767 2548 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 24 06:49:07.679173 kubelet[2548]: I1124 06:49:07.679092 2548 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 24 06:49:07.733503 kubelet[2548]: I1124 06:49:07.732845 2548 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 24 06:49:07.744069 kubelet[2548]: I1124 06:49:07.743385 2548 server.go:479] "Adding debug handlers to kubelet server" Nov 24 06:49:07.744561 kubelet[2548]: I1124 06:49:07.744546 2548 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Nov 24 06:49:07.747087 kubelet[2548]: I1124 06:49:07.746516 2548 volume_manager.go:297] "Starting Kubelet Volume Manager" Nov 24 06:49:07.747087 kubelet[2548]: E1124 06:49:07.746652 2548 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Nov 24 06:49:07.748526 kubelet[2548]: I1124 06:49:07.748510 2548 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Nov 24 06:49:07.748562 kubelet[2548]: I1124 06:49:07.748545 2548 reconciler.go:26] "Reconciler: start to sync state" Nov 24 06:49:07.766937 kubelet[2548]: E1124 06:49:07.766706 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="200ms" Nov 24 06:49:07.766937 kubelet[2548]: W1124 06:49:07.766890 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: 
connect: connection refused Nov 24 06:49:07.766937 kubelet[2548]: E1124 06:49:07.766918 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:07.772495 kubelet[2548]: E1124 06:49:07.753199 2548 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.187ade97cc2f8630 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-11-24 06:49:07.616065072 +0000 UTC m=+0.453025489,LastTimestamp:2025-11-24 06:49:07.616065072 +0000 UTC m=+0.453025489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Nov 24 06:49:07.773586 kubelet[2548]: I1124 06:49:07.773574 2548 factory.go:221] Registration of the systemd container factory successfully Nov 24 06:49:07.773765 kubelet[2548]: I1124 06:49:07.773753 2548 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Nov 24 06:49:07.781047 kubelet[2548]: I1124 06:49:07.781025 2548 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 24 06:49:07.781803 kubelet[2548]: I1124 06:49:07.781794 2548 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 24 06:49:07.781860 kubelet[2548]: I1124 06:49:07.781848 2548 status_manager.go:227] "Starting to sync pod status with apiserver" Nov 24 06:49:07.781932 kubelet[2548]: I1124 06:49:07.781925 2548 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
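The container_manager_linux.go entry logged above embeds the kubelet's effective node configuration as a single-line JSON object after "nodeConfig=" (cgroup driver, hard eviction thresholds, CPU and topology manager policies). A small sketch, again assuming a saved copy of this log named console.log and that each entry sits on one physical line as emitted, that pulls the blob out and pretty-prints it:

#!/usr/bin/env python3
"""Extract and pretty-print the kubelet nodeConfig JSON from a saved console log."""
import json
import sys

def balanced_object(text: str, start: int) -> str:
    """Return the {...} substring starting at text[start], matching braces naively."""
    depth = 0
    for i in range(start, len(text)):
        if text[i] == "{":
            depth += 1
        elif text[i] == "}":
            depth -= 1
            if depth == 0:
                return text[start : i + 1]
    raise ValueError("unterminated object")

def main(path: str) -> None:
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            pos = line.find("nodeConfig={")
            if pos == -1:
                continue
            blob = balanced_object(line, pos + len("nodeConfig="))
            print(json.dumps(json.loads(blob), indent=2))
            return

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "console.log")

The brace matching ignores braces inside strings, which is fine for this particular blob; arbitrary log entries would need a stricter parser.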
Nov 24 06:49:07.781979 kubelet[2548]: I1124 06:49:07.781975 2548 kubelet.go:2382] "Starting kubelet main sync loop" Nov 24 06:49:07.782041 kubelet[2548]: E1124 06:49:07.782027 2548 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 24 06:49:07.786152 kubelet[2548]: W1124 06:49:07.786128 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:07.786239 kubelet[2548]: E1124 06:49:07.786216 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:07.786581 kubelet[2548]: I1124 06:49:07.786572 2548 factory.go:221] Registration of the containerd container factory successfully Nov 24 06:49:07.807149 kubelet[2548]: E1124 06:49:07.807125 2548 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Nov 24 06:49:07.809882 kubelet[2548]: I1124 06:49:07.809868 2548 cpu_manager.go:221] "Starting CPU manager" policy="none" Nov 24 06:49:07.809882 kubelet[2548]: I1124 06:49:07.809878 2548 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Nov 24 06:49:07.810001 kubelet[2548]: I1124 06:49:07.809890 2548 state_mem.go:36] "Initialized new in-memory state store" Nov 24 06:49:07.813397 kubelet[2548]: I1124 06:49:07.813375 2548 policy_none.go:49] "None policy: Start" Nov 24 06:49:07.813397 kubelet[2548]: I1124 06:49:07.813394 2548 memory_manager.go:186] "Starting memorymanager" policy="None" Nov 24 06:49:07.813397 kubelet[2548]: I1124 06:49:07.813401 2548 state_mem.go:35] "Initializing new in-memory state store" Nov 24 06:49:07.819103 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Nov 24 06:49:07.831827 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Nov 24 06:49:07.834726 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Nov 24 06:49:07.847493 kubelet[2548]: E1124 06:49:07.847460 2548 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Nov 24 06:49:07.849045 kubelet[2548]: I1124 06:49:07.849025 2548 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 24 06:49:07.849223 kubelet[2548]: I1124 06:49:07.849210 2548 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 24 06:49:07.850479 kubelet[2548]: I1124 06:49:07.850452 2548 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 24 06:49:07.850604 kubelet[2548]: I1124 06:49:07.850593 2548 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 24 06:49:07.852198 kubelet[2548]: E1124 06:49:07.852179 2548 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Nov 24 06:49:07.852291 kubelet[2548]: E1124 06:49:07.852216 2548 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Nov 24 06:49:07.889514 systemd[1]: Created slice kubepods-burstable-pod55d9ac750f8c9141f337af8b08cf5c9d.slice - libcontainer container kubepods-burstable-pod55d9ac750f8c9141f337af8b08cf5c9d.slice. Nov 24 06:49:07.918615 kubelet[2548]: E1124 06:49:07.918493 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:07.921363 systemd[1]: Created slice kubepods-burstable-pod0a68423804124305a9de061f38780871.slice - libcontainer container kubepods-burstable-pod0a68423804124305a9de061f38780871.slice. Nov 24 06:49:07.922732 kubelet[2548]: E1124 06:49:07.922722 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:07.924387 systemd[1]: Created slice kubepods-burstable-pod12f9984155c7b92e56a65940e66c34f9.slice - libcontainer container kubepods-burstable-pod12f9984155c7b92e56a65940e66c34f9.slice. Nov 24 06:49:07.925358 kubelet[2548]: E1124 06:49:07.925347 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:07.951653 kubelet[2548]: I1124 06:49:07.951618 2548 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:07.951919 kubelet[2548]: E1124 06:49:07.951857 2548 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Nov 24 06:49:07.987654 kubelet[2548]: E1124 06:49:07.987617 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="400ms" Nov 24 06:49:08.049962 kubelet[2548]: I1124 06:49:08.049777 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:08.049962 kubelet[2548]: I1124 06:49:08.049807 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:08.049962 kubelet[2548]: I1124 06:49:08.049835 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:08.049962 kubelet[2548]: I1124 06:49:08.049848 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:08.049962 kubelet[2548]: I1124 06:49:08.049860 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a68423804124305a9de061f38780871-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0a68423804124305a9de061f38780871\") " pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:08.050144 kubelet[2548]: I1124 06:49:08.049871 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:08.050144 kubelet[2548]: I1124 06:49:08.049883 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:08.050144 kubelet[2548]: I1124 06:49:08.049895 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:08.050144 kubelet[2548]: I1124 06:49:08.049905 2548 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:08.153644 kubelet[2548]: I1124 06:49:08.153617 2548 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:08.153950 kubelet[2548]: E1124 06:49:08.153927 2548 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Nov 24 06:49:08.219861 containerd[1631]: time="2025-11-24T06:49:08.219553127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:55d9ac750f8c9141f337af8b08cf5c9d,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:08.223332 containerd[1631]: time="2025-11-24T06:49:08.223299158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0a68423804124305a9de061f38780871,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:08.226552 containerd[1631]: time="2025-11-24T06:49:08.226531344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:12f9984155c7b92e56a65940e66c34f9,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:08.388106 kubelet[2548]: E1124 06:49:08.388018 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="800ms" Nov 24 06:49:08.508190 containerd[1631]: time="2025-11-24T06:49:08.508105002Z" level=info msg="connecting to shim 5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190" address="unix:///run/containerd/s/916ed41be0b3589c63c2df6d73d9ed4de04d7c4aa9af1657603f9d97dbd5b5b4" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:08.509039 containerd[1631]: time="2025-11-24T06:49:08.508970662Z" level=info msg="connecting to shim 29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14" address="unix:///run/containerd/s/b66302f4bb525adc5555c3a37d2046993662656bb80bfd6c9ce8f8638abe215e" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:08.513965 containerd[1631]: time="2025-11-24T06:49:08.513902095Z" level=info msg="connecting to shim de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e" address="unix:///run/containerd/s/6225a391184ee8e88db73c6f530fd93810c7318937013565aa2639ea01533aab" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:08.561060 kubelet[2548]: I1124 06:49:08.560249 2548 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:08.561060 kubelet[2548]: E1124 06:49:08.560474 2548 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Nov 24 06:49:08.563266 kubelet[2548]: W1124 06:49:08.561739 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:08.563332 kubelet[2548]: E1124 06:49:08.563268 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:08.599389 systemd[1]: Started cri-containerd-29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14.scope - libcontainer container 29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14. Nov 24 06:49:08.600825 systemd[1]: Started cri-containerd-5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190.scope - libcontainer container 5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190. Nov 24 06:49:08.602349 systemd[1]: Started cri-containerd-de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e.scope - libcontainer container de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e. 
Nov 24 06:49:08.672324 containerd[1631]: time="2025-11-24T06:49:08.672266938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0a68423804124305a9de061f38780871,Namespace:kube-system,Attempt:0,} returns sandbox id \"de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e\"" Nov 24 06:49:08.673087 containerd[1631]: time="2025-11-24T06:49:08.673070232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:12f9984155c7b92e56a65940e66c34f9,Namespace:kube-system,Attempt:0,} returns sandbox id \"5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190\"" Nov 24 06:49:08.683955 containerd[1631]: time="2025-11-24T06:49:08.683928554Z" level=info msg="CreateContainer within sandbox \"de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Nov 24 06:49:08.685741 containerd[1631]: time="2025-11-24T06:49:08.685359867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:55d9ac750f8c9141f337af8b08cf5c9d,Namespace:kube-system,Attempt:0,} returns sandbox id \"29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14\"" Nov 24 06:49:08.689172 kubelet[2548]: W1124 06:49:08.689096 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:08.689584 kubelet[2548]: E1124 06:49:08.689548 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:08.690912 containerd[1631]: time="2025-11-24T06:49:08.690875316Z" level=info msg="CreateContainer within sandbox \"5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Nov 24 06:49:08.695918 containerd[1631]: time="2025-11-24T06:49:08.695845092Z" level=info msg="CreateContainer within sandbox \"29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Nov 24 06:49:08.704848 containerd[1631]: time="2025-11-24T06:49:08.704333191Z" level=info msg="Container 85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:08.706838 containerd[1631]: time="2025-11-24T06:49:08.706644259Z" level=info msg="Container 2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:08.710672 containerd[1631]: time="2025-11-24T06:49:08.710639005Z" level=info msg="CreateContainer within sandbox \"de9e0e358352edd12f2d531d3d31e9435fa6e20aa720bfb2a10d04720735fa2e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c\"" Nov 24 06:49:08.711259 containerd[1631]: time="2025-11-24T06:49:08.711027798Z" level=info msg="Container 989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:08.711491 containerd[1631]: 
time="2025-11-24T06:49:08.711475016Z" level=info msg="StartContainer for \"85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c\"" Nov 24 06:49:08.713366 containerd[1631]: time="2025-11-24T06:49:08.713340173Z" level=info msg="CreateContainer within sandbox \"5d14b5f245e97e52aca329f28954b68f2d7469f40f2da68ce10450f2c0f63190\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844\"" Nov 24 06:49:08.713923 containerd[1631]: time="2025-11-24T06:49:08.713898269Z" level=info msg="connecting to shim 85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c" address="unix:///run/containerd/s/6225a391184ee8e88db73c6f530fd93810c7318937013565aa2639ea01533aab" protocol=ttrpc version=3 Nov 24 06:49:08.714151 containerd[1631]: time="2025-11-24T06:49:08.714129008Z" level=info msg="StartContainer for \"2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844\"" Nov 24 06:49:08.716249 containerd[1631]: time="2025-11-24T06:49:08.716006640Z" level=info msg="connecting to shim 2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844" address="unix:///run/containerd/s/916ed41be0b3589c63c2df6d73d9ed4de04d7c4aa9af1657603f9d97dbd5b5b4" protocol=ttrpc version=3 Nov 24 06:49:08.717832 containerd[1631]: time="2025-11-24T06:49:08.717807393Z" level=info msg="CreateContainer within sandbox \"29ba7a4c1b0e418762b882813a2826bb268e9cecd07522051bc745eded28fd14\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540\"" Nov 24 06:49:08.718596 containerd[1631]: time="2025-11-24T06:49:08.718562797Z" level=info msg="StartContainer for \"989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540\"" Nov 24 06:49:08.719415 containerd[1631]: time="2025-11-24T06:49:08.719386988Z" level=info msg="connecting to shim 989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540" address="unix:///run/containerd/s/b66302f4bb525adc5555c3a37d2046993662656bb80bfd6c9ce8f8638abe215e" protocol=ttrpc version=3 Nov 24 06:49:08.742446 systemd[1]: Started cri-containerd-85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c.scope - libcontainer container 85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c. Nov 24 06:49:08.750384 systemd[1]: Started cri-containerd-2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844.scope - libcontainer container 2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844. Nov 24 06:49:08.751932 systemd[1]: Started cri-containerd-989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540.scope - libcontainer container 989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540. 
Nov 24 06:49:08.807983 containerd[1631]: time="2025-11-24T06:49:08.807429055Z" level=info msg="StartContainer for \"2de65791e5f801bbb3a402a000d012eeeadec4aec4a535e0b08fac99c0a6a844\" returns successfully" Nov 24 06:49:08.816130 containerd[1631]: time="2025-11-24T06:49:08.816056769Z" level=info msg="StartContainer for \"989130215421d4be955b85aec7a4da78c0f86cc7370511b4d210f48e3635c540\" returns successfully" Nov 24 06:49:08.836480 containerd[1631]: time="2025-11-24T06:49:08.836453290Z" level=info msg="StartContainer for \"85cef64c34a1be8552af7fe1c516a8719342a3ed45a295e760e76049463fef4c\" returns successfully" Nov 24 06:49:08.839628 kubelet[2548]: E1124 06:49:08.839460 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:08.842278 kubelet[2548]: E1124 06:49:08.842259 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:09.041621 kubelet[2548]: W1124 06:49:09.041584 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:09.041621 kubelet[2548]: E1124 06:49:09.041627 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:09.189146 kubelet[2548]: E1124 06:49:09.189111 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="1.6s" Nov 24 06:49:09.253761 kubelet[2548]: W1124 06:49:09.253705 2548 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.102:6443: connect: connection refused Nov 24 06:49:09.253761 kubelet[2548]: E1124 06:49:09.253746 2548 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:09.361560 kubelet[2548]: I1124 06:49:09.361309 2548 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:09.361560 kubelet[2548]: E1124 06:49:09.361490 2548 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Nov 24 06:49:09.411401 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount462372781.mount: Deactivated successfully. 
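Across the "Failed to ensure lease exists, will retry" entries above, the retry interval doubles on each attempt (200ms, 400ms, 800ms, 1.6s) while the apiserver at 139.178.70.102:6443 is still refusing connections. A toy model of that doubling; the cap value below is an assumption for illustration, since only the doubling itself is visible in this log:

# Doubling retry intervals, as seen in the lease-controller entries above.
from itertools import islice
from typing import Iterator

def retry_intervals(base_ms: float = 200.0, cap_ms: float = 7000.0) -> Iterator[float]:
    """Yield retry delays in milliseconds: base, 2*base, 4*base, ..., capped at cap_ms."""
    delay = base_ms
    while True:
        yield min(delay, cap_ms)
        delay *= 2

if __name__ == "__main__":
    print(list(islice(retry_intervals(), 6)))  # [200.0, 400.0, 800.0, 1600.0, 3200.0, 6400.0]

Once the kube-apiserver static pod started above begins listening, the retries stop failing, as the later "Successfully registered node" entry shows.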
Nov 24 06:49:09.761684 kubelet[2548]: E1124 06:49:09.761657 2548 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" Nov 24 06:49:09.845186 kubelet[2548]: E1124 06:49:09.845147 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:09.846421 kubelet[2548]: E1124 06:49:09.846352 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:10.846515 kubelet[2548]: E1124 06:49:10.846476 2548 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Nov 24 06:49:10.963399 kubelet[2548]: I1124 06:49:10.963213 2548 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:11.256121 kubelet[2548]: E1124 06:49:11.256100 2548 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Nov 24 06:49:11.379071 kubelet[2548]: I1124 06:49:11.378627 2548 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Nov 24 06:49:11.447378 kubelet[2548]: I1124 06:49:11.447348 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:11.466850 kubelet[2548]: E1124 06:49:11.466797 2548 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:11.466850 kubelet[2548]: I1124 06:49:11.466824 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:11.469370 kubelet[2548]: E1124 06:49:11.469344 2548 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:11.469370 kubelet[2548]: I1124 06:49:11.469361 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:11.470759 kubelet[2548]: E1124 06:49:11.470740 2548 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:11.618677 kubelet[2548]: I1124 06:49:11.618557 2548 apiserver.go:52] "Watching apiserver" Nov 24 06:49:11.634410 kubelet[2548]: I1124 06:49:11.634305 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:11.636308 kubelet[2548]: E1124 06:49:11.636249 2548 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:11.649389 kubelet[2548]: I1124 06:49:11.649365 2548 desired_state_of_world_populator.go:158] "Finished 
populating initial desired state of world" Nov 24 06:49:11.846703 kubelet[2548]: I1124 06:49:11.846534 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:11.847897 kubelet[2548]: E1124 06:49:11.847873 2548 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:13.361585 systemd[1]: Reload requested from client PID 2817 ('systemctl') (unit session-9.scope)... Nov 24 06:49:13.361598 systemd[1]: Reloading... Nov 24 06:49:13.366218 kubelet[2548]: I1124 06:49:13.365679 2548 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:13.434296 zram_generator::config[2866]: No configuration found. Nov 24 06:49:13.509356 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:49:13.594423 systemd[1]: Reloading finished in 232 ms. Nov 24 06:49:13.612158 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:49:13.623540 systemd[1]: kubelet.service: Deactivated successfully. Nov 24 06:49:13.623737 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:13.623781 systemd[1]: kubelet.service: Consumed 569ms CPU time, 131.8M memory peak. Nov 24 06:49:13.625730 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:49:13.899386 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:49:13.909483 (kubelet)[2928]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Nov 24 06:49:13.988413 kubelet[2928]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 06:49:13.988413 kubelet[2928]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Nov 24 06:49:13.988413 kubelet[2928]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 06:49:13.988682 kubelet[2928]: I1124 06:49:13.988445 2928 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 24 06:49:14.000005 kubelet[2928]: I1124 06:49:13.999845 2928 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Nov 24 06:49:14.000005 kubelet[2928]: I1124 06:49:13.999863 2928 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 24 06:49:14.000677 kubelet[2928]: I1124 06:49:14.000662 2928 server.go:954] "Client rotation is on, will bootstrap in background" Nov 24 06:49:14.003783 kubelet[2928]: I1124 06:49:14.002269 2928 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 24 06:49:14.018274 kubelet[2928]: I1124 06:49:14.018250 2928 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Nov 24 06:49:14.020686 kubelet[2928]: I1124 06:49:14.020658 2928 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 24 06:49:14.023507 kubelet[2928]: I1124 06:49:14.023488 2928 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Nov 24 06:49:14.023648 kubelet[2928]: I1124 06:49:14.023625 2928 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 24 06:49:14.023853 kubelet[2928]: I1124 06:49:14.023649 2928 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 24 06:49:14.023920 kubelet[2928]: I1124 06:49:14.023857 2928 topology_manager.go:138] "Creating topology manager with none policy" Nov 24 06:49:14.023920 kubelet[2928]: I1124 06:49:14.023868 2928 container_manager_linux.go:304] "Creating device plugin manager" Nov 24 06:49:14.023920 kubelet[2928]: I1124 06:49:14.023896 2928 state_mem.go:36] "Initialized new in-memory state store" Nov 24 06:49:14.024034 kubelet[2928]: I1124 06:49:14.024022 2928 kubelet.go:446] "Attempting to sync node with API server" Nov 24 06:49:14.024057 kubelet[2928]: I1124 06:49:14.024038 2928 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 24 06:49:14.024057 kubelet[2928]: I1124 06:49:14.024051 2928 kubelet.go:352] "Adding apiserver pod source" Nov 24 06:49:14.024098 kubelet[2928]: I1124 06:49:14.024058 2928 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 24 06:49:14.037538 kubelet[2928]: I1124 06:49:14.037516 2928 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Nov 24 06:49:14.037810 kubelet[2928]: I1124 06:49:14.037796 2928 kubelet.go:890] "Not starting ClusterTrustBundle informer 
because we are in static kubelet mode" Nov 24 06:49:14.038198 kubelet[2928]: I1124 06:49:14.038184 2928 watchdog_linux.go:99] "Systemd watchdog is not enabled" Nov 24 06:49:14.038286 kubelet[2928]: I1124 06:49:14.038204 2928 server.go:1287] "Started kubelet" Nov 24 06:49:14.050517 kubelet[2928]: I1124 06:49:14.050490 2928 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 24 06:49:14.059742 kubelet[2928]: I1124 06:49:14.059458 2928 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Nov 24 06:49:14.061167 kubelet[2928]: I1124 06:49:14.061133 2928 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 24 06:49:14.062994 kubelet[2928]: I1124 06:49:14.062976 2928 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 24 06:49:14.064124 kubelet[2928]: I1124 06:49:14.064106 2928 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Nov 24 06:49:14.070210 kubelet[2928]: I1124 06:49:14.069310 2928 volume_manager.go:297] "Starting Kubelet Volume Manager" Nov 24 06:49:14.070210 kubelet[2928]: I1124 06:49:14.069379 2928 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Nov 24 06:49:14.070210 kubelet[2928]: I1124 06:49:14.069438 2928 reconciler.go:26] "Reconciler: start to sync state" Nov 24 06:49:14.070210 kubelet[2928]: I1124 06:49:14.069593 2928 server.go:479] "Adding debug handlers to kubelet server" Nov 24 06:49:14.073162 kubelet[2928]: E1124 06:49:14.073140 2928 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Nov 24 06:49:14.074387 kubelet[2928]: I1124 06:49:14.074345 2928 factory.go:221] Registration of the systemd container factory successfully Nov 24 06:49:14.075020 kubelet[2928]: I1124 06:49:14.075000 2928 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Nov 24 06:49:14.076832 kubelet[2928]: I1124 06:49:14.076815 2928 factory.go:221] Registration of the containerd container factory successfully Nov 24 06:49:14.077701 kubelet[2928]: I1124 06:49:14.077678 2928 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 24 06:49:14.078326 kubelet[2928]: I1124 06:49:14.078307 2928 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 24 06:49:14.090781 kubelet[2928]: I1124 06:49:14.090502 2928 status_manager.go:227] "Starting to sync pod status with apiserver" Nov 24 06:49:14.090781 kubelet[2928]: I1124 06:49:14.090548 2928 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Nov 24 06:49:14.090781 kubelet[2928]: I1124 06:49:14.090552 2928 kubelet.go:2382] "Starting kubelet main sync loop" Nov 24 06:49:14.090781 kubelet[2928]: E1124 06:49:14.090593 2928 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 24 06:49:14.114584 kubelet[2928]: I1124 06:49:14.114567 2928 cpu_manager.go:221] "Starting CPU manager" policy="none" Nov 24 06:49:14.114745 kubelet[2928]: I1124 06:49:14.114735 2928 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Nov 24 06:49:14.114813 kubelet[2928]: I1124 06:49:14.114806 2928 state_mem.go:36] "Initialized new in-memory state store" Nov 24 06:49:14.115025 kubelet[2928]: I1124 06:49:14.115015 2928 state_mem.go:88] "Updated default CPUSet" cpuSet="" Nov 24 06:49:14.115087 kubelet[2928]: I1124 06:49:14.115071 2928 state_mem.go:96] "Updated CPUSet assignments" assignments={} Nov 24 06:49:14.115133 kubelet[2928]: I1124 06:49:14.115127 2928 policy_none.go:49] "None policy: Start" Nov 24 06:49:14.115184 kubelet[2928]: I1124 06:49:14.115178 2928 memory_manager.go:186] "Starting memorymanager" policy="None" Nov 24 06:49:14.115248 kubelet[2928]: I1124 06:49:14.115233 2928 state_mem.go:35] "Initializing new in-memory state store" Nov 24 06:49:14.115390 kubelet[2928]: I1124 06:49:14.115381 2928 state_mem.go:75] "Updated machine memory state" Nov 24 06:49:14.122091 kubelet[2928]: I1124 06:49:14.121687 2928 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 24 06:49:14.122091 kubelet[2928]: I1124 06:49:14.121815 2928 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 24 06:49:14.122091 kubelet[2928]: I1124 06:49:14.121825 2928 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 24 06:49:14.122091 kubelet[2928]: I1124 06:49:14.121962 2928 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 24 06:49:14.122908 kubelet[2928]: E1124 06:49:14.122897 2928 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Nov 24 06:49:14.191912 kubelet[2928]: I1124 06:49:14.191853 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.210114 kubelet[2928]: I1124 06:49:14.210088 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:14.210618 kubelet[2928]: I1124 06:49:14.210140 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:14.224760 kubelet[2928]: I1124 06:49:14.224745 2928 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:49:14.236044 kubelet[2928]: E1124 06:49:14.235941 2928 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:14.251034 kubelet[2928]: I1124 06:49:14.251000 2928 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Nov 24 06:49:14.251319 kubelet[2928]: I1124 06:49:14.251175 2928 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Nov 24 06:49:14.270457 kubelet[2928]: I1124 06:49:14.270435 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.270680 kubelet[2928]: I1124 06:49:14.270562 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.270680 kubelet[2928]: I1124 06:49:14.270579 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.270680 kubelet[2928]: I1124 06:49:14.270591 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:14.270680 kubelet[2928]: I1124 06:49:14.270601 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:14.270680 kubelet[2928]: I1124 06:49:14.270614 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/12f9984155c7b92e56a65940e66c34f9-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"12f9984155c7b92e56a65940e66c34f9\") " 
pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:14.270782 kubelet[2928]: I1124 06:49:14.270627 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.270782 kubelet[2928]: I1124 06:49:14.270636 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:14.270782 kubelet[2928]: I1124 06:49:14.270663 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a68423804124305a9de061f38780871-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0a68423804124305a9de061f38780871\") " pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:15.026113 kubelet[2928]: I1124 06:49:15.026087 2928 apiserver.go:52] "Watching apiserver" Nov 24 06:49:15.069894 kubelet[2928]: I1124 06:49:15.069863 2928 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Nov 24 06:49:15.105041 kubelet[2928]: I1124 06:49:15.105021 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:15.105213 kubelet[2928]: I1124 06:49:15.105200 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:15.105386 kubelet[2928]: I1124 06:49:15.105328 2928 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:15.123883 kubelet[2928]: E1124 06:49:15.123850 2928 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Nov 24 06:49:15.130592 kubelet[2928]: I1124 06:49:15.130499 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.130476027 podStartE2EDuration="2.130476027s" podCreationTimestamp="2025-11-24 06:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:15.130249818 +0000 UTC m=+1.201703443" watchObservedRunningTime="2025-11-24 06:49:15.130476027 +0000 UTC m=+1.201929643" Nov 24 06:49:15.130940 kubelet[2928]: E1124 06:49:15.130921 2928 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Nov 24 06:49:15.131043 kubelet[2928]: E1124 06:49:15.131031 2928 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Nov 24 06:49:15.165971 kubelet[2928]: I1124 06:49:15.165921 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.1659091400000001 podStartE2EDuration="1.16590914s" podCreationTimestamp="2025-11-24 06:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:15.157090105 +0000 UTC m=+1.228543724" watchObservedRunningTime="2025-11-24 06:49:15.16590914 +0000 UTC m=+1.237362755" Nov 24 06:49:18.075202 kubelet[2928]: I1124 06:49:18.075134 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=4.075122758 podStartE2EDuration="4.075122758s" podCreationTimestamp="2025-11-24 06:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:15.166076906 +0000 UTC m=+1.237530530" watchObservedRunningTime="2025-11-24 06:49:18.075122758 +0000 UTC m=+4.146576377" Nov 24 06:49:18.651436 kubelet[2928]: I1124 06:49:18.651401 2928 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Nov 24 06:49:18.651746 containerd[1631]: time="2025-11-24T06:49:18.651725306Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Nov 24 06:49:18.652036 kubelet[2928]: I1124 06:49:18.651848 2928 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Nov 24 06:49:19.392370 systemd[1]: Created slice kubepods-besteffort-podd1775197_cc34_49fd_9a43_a84dc2af752e.slice - libcontainer container kubepods-besteffort-podd1775197_cc34_49fd_9a43_a84dc2af752e.slice. Nov 24 06:49:19.406261 kubelet[2928]: I1124 06:49:19.406116 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d1775197-cc34-49fd-9a43-a84dc2af752e-xtables-lock\") pod \"kube-proxy-72xf9\" (UID: \"d1775197-cc34-49fd-9a43-a84dc2af752e\") " pod="kube-system/kube-proxy-72xf9" Nov 24 06:49:19.406261 kubelet[2928]: I1124 06:49:19.406144 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d1775197-cc34-49fd-9a43-a84dc2af752e-kube-proxy\") pod \"kube-proxy-72xf9\" (UID: \"d1775197-cc34-49fd-9a43-a84dc2af752e\") " pod="kube-system/kube-proxy-72xf9" Nov 24 06:49:19.406261 kubelet[2928]: I1124 06:49:19.406180 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1775197-cc34-49fd-9a43-a84dc2af752e-lib-modules\") pod \"kube-proxy-72xf9\" (UID: \"d1775197-cc34-49fd-9a43-a84dc2af752e\") " pod="kube-system/kube-proxy-72xf9" Nov 24 06:49:19.406261 kubelet[2928]: I1124 06:49:19.406197 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7nq6\" (UniqueName: \"kubernetes.io/projected/d1775197-cc34-49fd-9a43-a84dc2af752e-kube-api-access-p7nq6\") pod \"kube-proxy-72xf9\" (UID: \"d1775197-cc34-49fd-9a43-a84dc2af752e\") " pod="kube-system/kube-proxy-72xf9" Nov 24 06:49:19.701933 containerd[1631]: time="2025-11-24T06:49:19.701708727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-72xf9,Uid:d1775197-cc34-49fd-9a43-a84dc2af752e,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:19.712371 kubelet[2928]: W1124 06:49:19.712333 2928 reflector.go:569] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace 
"tigera-operator": no relationship found between node 'localhost' and this object Nov 24 06:49:19.712520 kubelet[2928]: W1124 06:49:19.712333 2928 reflector.go:569] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object Nov 24 06:49:19.712520 kubelet[2928]: E1124 06:49:19.712438 2928 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:19.712520 kubelet[2928]: I1124 06:49:19.712467 2928 status_manager.go:890] "Failed to get status for pod" podUID="e985ee1c-2d13-48fe-b5ee-94dc585358eb" pod="tigera-operator/tigera-operator-7dcd859c48-4tcmn" err="pods \"tigera-operator-7dcd859c48-4tcmn\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" Nov 24 06:49:19.712687 kubelet[2928]: E1124 06:49:19.712671 2928 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:19.716351 systemd[1]: Created slice kubepods-besteffort-pode985ee1c_2d13_48fe_b5ee_94dc585358eb.slice - libcontainer container kubepods-besteffort-pode985ee1c_2d13_48fe_b5ee_94dc585358eb.slice. Nov 24 06:49:19.755926 containerd[1631]: time="2025-11-24T06:49:19.755899115Z" level=info msg="connecting to shim f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345" address="unix:///run/containerd/s/75a1f3f09e490fc8a638b495bbfcc1a58d7b044dbc8ee7bcf8d0ead17a2faf0d" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:19.779379 systemd[1]: Started cri-containerd-f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345.scope - libcontainer container f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345. 
Nov 24 06:49:19.802107 containerd[1631]: time="2025-11-24T06:49:19.802084378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-72xf9,Uid:d1775197-cc34-49fd-9a43-a84dc2af752e,Namespace:kube-system,Attempt:0,} returns sandbox id \"f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345\"" Nov 24 06:49:19.804445 containerd[1631]: time="2025-11-24T06:49:19.804244888Z" level=info msg="CreateContainer within sandbox \"f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Nov 24 06:49:19.808440 kubelet[2928]: I1124 06:49:19.808407 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42gq\" (UniqueName: \"kubernetes.io/projected/e985ee1c-2d13-48fe-b5ee-94dc585358eb-kube-api-access-w42gq\") pod \"tigera-operator-7dcd859c48-4tcmn\" (UID: \"e985ee1c-2d13-48fe-b5ee-94dc585358eb\") " pod="tigera-operator/tigera-operator-7dcd859c48-4tcmn" Nov 24 06:49:19.808575 kubelet[2928]: I1124 06:49:19.808566 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e985ee1c-2d13-48fe-b5ee-94dc585358eb-var-lib-calico\") pod \"tigera-operator-7dcd859c48-4tcmn\" (UID: \"e985ee1c-2d13-48fe-b5ee-94dc585358eb\") " pod="tigera-operator/tigera-operator-7dcd859c48-4tcmn" Nov 24 06:49:19.838356 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount550542308.mount: Deactivated successfully. Nov 24 06:49:19.842274 containerd[1631]: time="2025-11-24T06:49:19.842250971Z" level=info msg="Container d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:19.870927 containerd[1631]: time="2025-11-24T06:49:19.870880919Z" level=info msg="CreateContainer within sandbox \"f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc\"" Nov 24 06:49:19.872245 containerd[1631]: time="2025-11-24T06:49:19.871460276Z" level=info msg="StartContainer for \"d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc\"" Nov 24 06:49:19.872510 containerd[1631]: time="2025-11-24T06:49:19.872489192Z" level=info msg="connecting to shim d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc" address="unix:///run/containerd/s/75a1f3f09e490fc8a638b495bbfcc1a58d7b044dbc8ee7bcf8d0ead17a2faf0d" protocol=ttrpc version=3 Nov 24 06:49:19.885362 systemd[1]: Started cri-containerd-d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc.scope - libcontainer container d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc. 
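Annotation (not part of the journal): the CreateContainer/StartContainer records above reuse the per-sandbox shim socket that containerd opened for the kube-proxy sandbox (the same /run/containerd/s/<hash> address appears in both "connecting to shim" messages), and systemd runs the container in a transient cri-containerd-<id>.scope unit. A small standard-library sketch, with the ids and address copied verbatim from the log and therefore specific to this node, that makes those relationships explicit:

    # Ids and shim address are taken verbatim from the records above.
    import os
    import stat

    sandbox_id   = "f887783913382023aafde34abeab7da05d4130c199c6778134ceb4a1579b4345"
    container_id = "d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc"
    shim_address = ("unix:///run/containerd/s/"
                    "75a1f3f09e490fc8a638b495bbfcc1a58d7b044dbc8ee7bcf8d0ead17a2faf0d")

    # The transient systemd units started for the sandbox and the container, as logged.
    print(f"cri-containerd-{sandbox_id}.scope")
    print(f"cri-containerd-{container_id}.scope")

    # Strip the unix:// scheme and check that the shim endpoint is a unix-domain socket.
    path = shim_address[len("unix://"):]
    try:
        print(path, "is a socket:", stat.S_ISSOCK(os.stat(path).st_mode))
    except FileNotFoundError:
        print(path, "not present (expected on any host other than this node)")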
Nov 24 06:49:19.935849 containerd[1631]: time="2025-11-24T06:49:19.935824289Z" level=info msg="StartContainer for \"d563c6394b1aa79255e2382d66b15df53de0f43f49f85a0663d8ab93e7c4bbfc\" returns successfully" Nov 24 06:49:20.121574 kubelet[2928]: I1124 06:49:20.121530 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-72xf9" podStartSLOduration=1.121515563 podStartE2EDuration="1.121515563s" podCreationTimestamp="2025-11-24 06:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:20.120868556 +0000 UTC m=+6.192322175" watchObservedRunningTime="2025-11-24 06:49:20.121515563 +0000 UTC m=+6.192969185" Nov 24 06:49:20.922264 containerd[1631]: time="2025-11-24T06:49:20.922217985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-4tcmn,Uid:e985ee1c-2d13-48fe-b5ee-94dc585358eb,Namespace:tigera-operator,Attempt:0,}" Nov 24 06:49:20.932460 containerd[1631]: time="2025-11-24T06:49:20.932417799Z" level=info msg="connecting to shim 4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73" address="unix:///run/containerd/s/57b98ce88100f4e3cf6495233f702e99d4e5bf7db4ee7a2e86ef5571115df2d3" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:20.956405 systemd[1]: Started cri-containerd-4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73.scope - libcontainer container 4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73. Nov 24 06:49:20.993087 containerd[1631]: time="2025-11-24T06:49:20.993016672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-4tcmn,Uid:e985ee1c-2d13-48fe-b5ee-94dc585358eb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73\"" Nov 24 06:49:20.995801 containerd[1631]: time="2025-11-24T06:49:20.995778050Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Nov 24 06:49:22.165561 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2412528282.mount: Deactivated successfully. 
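Annotation (not part of the journal): the pod_startup_latency_tracker record above for kube-proxy-72xf9 is internally consistent. With the pull timestamps at their zero value there is no image-pull time to subtract, so the reported podStartSLOduration is just watchObservedRunningTime minus podCreationTimestamp. A short standard-library check of that arithmetic, with the logged timestamps truncated to microseconds (all that datetime.strptime parses):

    # Reproduce podStartSLOduration=1.121515563s from the timestamps in the record above.
    from datetime import datetime

    fmt = "%Y-%m-%d %H:%M:%S.%f %z"
    created  = datetime.strptime("2025-11-24 06:49:19.000000 +0000", fmt)  # podCreationTimestamp
    observed = datetime.strptime("2025-11-24 06:49:20.121515 +0000", fmt)  # watchObservedRunningTime

    slo = (observed - created).total_seconds()
    print(f"podStartSLOduration ~= {slo:.6f}s")  # ~1.121515s, matching the logged value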
Nov 24 06:49:22.535083 containerd[1631]: time="2025-11-24T06:49:22.535058594Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:22.535981 containerd[1631]: time="2025-11-24T06:49:22.535966676Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Nov 24 06:49:22.536278 containerd[1631]: time="2025-11-24T06:49:22.536256536Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:22.537640 containerd[1631]: time="2025-11-24T06:49:22.537597638Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:22.538445 containerd[1631]: time="2025-11-24T06:49:22.538137718Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 1.542174946s" Nov 24 06:49:22.538445 containerd[1631]: time="2025-11-24T06:49:22.538152832Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Nov 24 06:49:22.539688 containerd[1631]: time="2025-11-24T06:49:22.539660637Z" level=info msg="CreateContainer within sandbox \"4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Nov 24 06:49:22.544843 containerd[1631]: time="2025-11-24T06:49:22.544825051Z" level=info msg="Container f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:22.548994 containerd[1631]: time="2025-11-24T06:49:22.548977262Z" level=info msg="CreateContainer within sandbox \"4241fe3cd5c4c28ab21c0cb4a13886f72df274d62efa10c2f0179139e3302e73\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb\"" Nov 24 06:49:22.549784 containerd[1631]: time="2025-11-24T06:49:22.549766159Z" level=info msg="StartContainer for \"f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb\"" Nov 24 06:49:22.551490 containerd[1631]: time="2025-11-24T06:49:22.551457789Z" level=info msg="connecting to shim f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb" address="unix:///run/containerd/s/57b98ce88100f4e3cf6495233f702e99d4e5bf7db4ee7a2e86ef5571115df2d3" protocol=ttrpc version=3 Nov 24 06:49:22.571334 systemd[1]: Started cri-containerd-f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb.scope - libcontainer container f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb. 
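Annotation (not part of the journal): the "Pulled image" record above packs the repo tag, image id, repo digest, compressed size, and wall-clock pull time into a single message. Purely as a reading aid, and with the message string copied from the log with the quoting unescaped, a standard-library sketch that splits it back into fields:

    # Split the "Pulled image" message from the record above into its components.
    import re

    msg = ('Pulled image "quay.io/tigera/operator:v1.38.7" with image id '
           '"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1", '
           'repo tag "quay.io/tigera/operator:v1.38.7", repo digest '
           '"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e", '
           'size "25057686" in 1.542174946s')

    m = re.match(
        r'Pulled image "(?P<tag>[^"]+)" with image id "(?P<id>[^"]+)", '
        r'repo tag "(?P<repo_tag>[^"]+)", repo digest "(?P<digest>[^"]+)", '
        r'size "(?P<size>\d+)" in (?P<secs>[\d.]+)s',
        msg,
    )
    assert m is not None
    print(m.group("tag"))
    print(m.group("digest"))
    print(f'{int(m.group("size")) / 1e6:.1f} MB pulled in {m.group("secs")}s')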
Nov 24 06:49:22.589683 containerd[1631]: time="2025-11-24T06:49:22.589658527Z" level=info msg="StartContainer for \"f9babada5257254423f67e34d64ef1de576a0453a9e48c4cd6bcf7557c2d9beb\" returns successfully" Nov 24 06:49:23.791085 kubelet[2928]: I1124 06:49:23.791050 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-4tcmn" podStartSLOduration=3.246944226 podStartE2EDuration="4.791037319s" podCreationTimestamp="2025-11-24 06:49:19 +0000 UTC" firstStartedPulling="2025-11-24 06:49:20.994560703 +0000 UTC m=+7.066014318" lastFinishedPulling="2025-11-24 06:49:22.538653797 +0000 UTC m=+8.610107411" observedRunningTime="2025-11-24 06:49:23.124601259 +0000 UTC m=+9.196054876" watchObservedRunningTime="2025-11-24 06:49:23.791037319 +0000 UTC m=+9.862490940" Nov 24 06:49:27.616342 sudo[1947]: pam_unix(sudo:session): session closed for user root Nov 24 06:49:27.618580 sshd[1946]: Connection closed by 147.75.109.163 port 47344 Nov 24 06:49:27.621439 sshd-session[1943]: pam_unix(sshd:session): session closed for user core Nov 24 06:49:27.625895 systemd[1]: sshd@6-139.178.70.102:22-147.75.109.163:47344.service: Deactivated successfully. Nov 24 06:49:27.626502 systemd-logind[1602]: Session 9 logged out. Waiting for processes to exit. Nov 24 06:49:27.629933 systemd[1]: session-9.scope: Deactivated successfully. Nov 24 06:49:27.630080 systemd[1]: session-9.scope: Consumed 3.275s CPU time, 152.8M memory peak. Nov 24 06:49:27.632954 systemd-logind[1602]: Removed session 9. Nov 24 06:49:31.697392 systemd[1]: Created slice kubepods-besteffort-pod82161473_89b4_47ed_b624_5bc9aeecbbc7.slice - libcontainer container kubepods-besteffort-pod82161473_89b4_47ed_b624_5bc9aeecbbc7.slice. Nov 24 06:49:31.789972 kubelet[2928]: I1124 06:49:31.789904 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77gx\" (UniqueName: \"kubernetes.io/projected/82161473-89b4-47ed-b624-5bc9aeecbbc7-kube-api-access-c77gx\") pod \"calico-typha-b4d889c99-psfk9\" (UID: \"82161473-89b4-47ed-b624-5bc9aeecbbc7\") " pod="calico-system/calico-typha-b4d889c99-psfk9" Nov 24 06:49:31.789972 kubelet[2928]: I1124 06:49:31.789939 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/82161473-89b4-47ed-b624-5bc9aeecbbc7-typha-certs\") pod \"calico-typha-b4d889c99-psfk9\" (UID: \"82161473-89b4-47ed-b624-5bc9aeecbbc7\") " pod="calico-system/calico-typha-b4d889c99-psfk9" Nov 24 06:49:31.789972 kubelet[2928]: I1124 06:49:31.789955 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82161473-89b4-47ed-b624-5bc9aeecbbc7-tigera-ca-bundle\") pod \"calico-typha-b4d889c99-psfk9\" (UID: \"82161473-89b4-47ed-b624-5bc9aeecbbc7\") " pod="calico-system/calico-typha-b4d889c99-psfk9" Nov 24 06:49:31.890390 kubelet[2928]: I1124 06:49:31.890361 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b5e47b66-affd-44cd-8137-7196fbbb55d6-node-certs\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890390 kubelet[2928]: I1124 06:49:31.890391 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" 
(UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-var-lib-calico\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890576 kubelet[2928]: I1124 06:49:31.890408 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7cpk\" (UniqueName: \"kubernetes.io/projected/b5e47b66-affd-44cd-8137-7196fbbb55d6-kube-api-access-k7cpk\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890576 kubelet[2928]: I1124 06:49:31.890423 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-var-run-calico\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890576 kubelet[2928]: I1124 06:49:31.890439 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-cni-bin-dir\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890576 kubelet[2928]: I1124 06:49:31.890457 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-cni-net-dir\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890576 kubelet[2928]: I1124 06:49:31.890475 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-flexvol-driver-host\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890669 kubelet[2928]: I1124 06:49:31.890484 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-policysync\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890669 kubelet[2928]: I1124 06:49:31.890498 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-cni-log-dir\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890669 kubelet[2928]: I1124 06:49:31.890508 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-lib-modules\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890669 kubelet[2928]: I1124 06:49:31.890517 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b5e47b66-affd-44cd-8137-7196fbbb55d6-tigera-ca-bundle\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890669 kubelet[2928]: I1124 06:49:31.890526 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b5e47b66-affd-44cd-8137-7196fbbb55d6-xtables-lock\") pod \"calico-node-pc79c\" (UID: \"b5e47b66-affd-44cd-8137-7196fbbb55d6\") " pod="calico-system/calico-node-pc79c" Nov 24 06:49:31.890979 systemd[1]: Created slice kubepods-besteffort-podb5e47b66_affd_44cd_8137_7196fbbb55d6.slice - libcontainer container kubepods-besteffort-podb5e47b66_affd_44cd_8137_7196fbbb55d6.slice. Nov 24 06:49:31.996042 kubelet[2928]: E1124 06:49:31.995978 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.996042 kubelet[2928]: W1124 06:49:31.996009 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.996614 kubelet[2928]: E1124 06:49:31.996580 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.997003 kubelet[2928]: E1124 06:49:31.996979 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.997003 kubelet[2928]: W1124 06:49:31.996991 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.997198 kubelet[2928]: E1124 06:49:31.997177 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.997363 kubelet[2928]: E1124 06:49:31.997356 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.997431 kubelet[2928]: W1124 06:49:31.997401 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.997532 kubelet[2928]: E1124 06:49:31.997514 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.997589 kubelet[2928]: E1124 06:49:31.997584 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.997664 kubelet[2928]: W1124 06:49:31.997621 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.997664 kubelet[2928]: E1124 06:49:31.997637 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:31.998708 kubelet[2928]: E1124 06:49:31.998621 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.998708 kubelet[2928]: W1124 06:49:31.998631 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.998708 kubelet[2928]: E1124 06:49:31.998642 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.998909 kubelet[2928]: E1124 06:49:31.998848 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.998909 kubelet[2928]: W1124 06:49:31.998854 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.998909 kubelet[2928]: E1124 06:49:31.998861 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.999063 kubelet[2928]: E1124 06:49:31.999005 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.999063 kubelet[2928]: W1124 06:49:31.999011 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.999063 kubelet[2928]: E1124 06:49:31.999016 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.999211 kubelet[2928]: E1124 06:49:31.999204 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.999345 kubelet[2928]: W1124 06:49:31.999271 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.999345 kubelet[2928]: E1124 06:49:31.999283 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:31.999478 kubelet[2928]: E1124 06:49:31.999454 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:31.999478 kubelet[2928]: W1124 06:49:31.999459 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:31.999478 kubelet[2928]: E1124 06:49:31.999466 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.001753 containerd[1631]: time="2025-11-24T06:49:32.001715892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b4d889c99-psfk9,Uid:82161473-89b4-47ed-b624-5bc9aeecbbc7,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:32.002892 kubelet[2928]: E1124 06:49:32.001912 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.002892 kubelet[2928]: W1124 06:49:32.001921 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.002892 kubelet[2928]: E1124 06:49:32.001933 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.020873 containerd[1631]: time="2025-11-24T06:49:32.020841002Z" level=info msg="connecting to shim fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe" address="unix:///run/containerd/s/a7bba525b5253224328185540d91ff8f792574dcb3683e0c10e9b0eccab59763" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:32.039395 systemd[1]: Started cri-containerd-fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe.scope - libcontainer container fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe. Nov 24 06:49:32.086900 containerd[1631]: time="2025-11-24T06:49:32.086816055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b4d889c99-psfk9,Uid:82161473-89b4-47ed-b624-5bc9aeecbbc7,Namespace:calico-system,Attempt:0,} returns sandbox id \"fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe\"" Nov 24 06:49:32.088553 containerd[1631]: time="2025-11-24T06:49:32.088072376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Nov 24 06:49:32.156823 kubelet[2928]: E1124 06:49:32.156780 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:32.187064 kubelet[2928]: E1124 06:49:32.187040 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.187457 kubelet[2928]: W1124 06:49:32.187384 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.187577 kubelet[2928]: E1124 06:49:32.187517 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.188077 kubelet[2928]: E1124 06:49:32.187975 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.188077 kubelet[2928]: W1124 06:49:32.187991 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.188077 kubelet[2928]: E1124 06:49:32.188001 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.188518 kubelet[2928]: E1124 06:49:32.188465 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.188518 kubelet[2928]: W1124 06:49:32.188475 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.188518 kubelet[2928]: E1124 06:49:32.188481 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.189154 kubelet[2928]: E1124 06:49:32.189041 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.189154 kubelet[2928]: W1124 06:49:32.189050 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.189154 kubelet[2928]: E1124 06:49:32.189058 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.190319 kubelet[2928]: E1124 06:49:32.190269 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.190319 kubelet[2928]: W1124 06:49:32.190281 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.190319 kubelet[2928]: E1124 06:49:32.190291 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.190582 kubelet[2928]: E1124 06:49:32.190547 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.190582 kubelet[2928]: W1124 06:49:32.190554 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.190582 kubelet[2928]: E1124 06:49:32.190561 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.191015 kubelet[2928]: E1124 06:49:32.190984 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.191015 kubelet[2928]: W1124 06:49:32.190996 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.191241 kubelet[2928]: E1124 06:49:32.191091 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.191670 kubelet[2928]: E1124 06:49:32.191536 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.191670 kubelet[2928]: W1124 06:49:32.191546 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.191670 kubelet[2928]: E1124 06:49:32.191555 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.192318 kubelet[2928]: E1124 06:49:32.192261 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.192504 kubelet[2928]: W1124 06:49:32.192458 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.192504 kubelet[2928]: E1124 06:49:32.192473 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.192995 kubelet[2928]: E1124 06:49:32.192859 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.192995 kubelet[2928]: W1124 06:49:32.192868 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.192995 kubelet[2928]: E1124 06:49:32.192877 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.193445 kubelet[2928]: E1124 06:49:32.193374 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.193655 kubelet[2928]: W1124 06:49:32.193529 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.193655 kubelet[2928]: E1124 06:49:32.193541 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.194201 kubelet[2928]: E1124 06:49:32.194057 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.194201 kubelet[2928]: W1124 06:49:32.194066 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.194201 kubelet[2928]: E1124 06:49:32.194073 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.194702 kubelet[2928]: E1124 06:49:32.194573 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.194702 kubelet[2928]: W1124 06:49:32.194584 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.194702 kubelet[2928]: E1124 06:49:32.194591 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.195136 kubelet[2928]: E1124 06:49:32.195067 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.195136 kubelet[2928]: W1124 06:49:32.195075 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.195136 kubelet[2928]: E1124 06:49:32.195083 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.195673 kubelet[2928]: E1124 06:49:32.195550 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.195673 kubelet[2928]: W1124 06:49:32.195560 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.195673 kubelet[2928]: E1124 06:49:32.195567 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.196183 kubelet[2928]: E1124 06:49:32.196029 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.196183 kubelet[2928]: W1124 06:49:32.196038 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.196183 kubelet[2928]: E1124 06:49:32.196046 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.196589 kubelet[2928]: E1124 06:49:32.196484 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.196770 kubelet[2928]: W1124 06:49:32.196645 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.196770 kubelet[2928]: E1124 06:49:32.196663 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.197118 kubelet[2928]: E1124 06:49:32.197019 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.197464 kubelet[2928]: W1124 06:49:32.197298 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.197464 kubelet[2928]: E1124 06:49:32.197313 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.197984 kubelet[2928]: E1124 06:49:32.197821 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.197984 kubelet[2928]: W1124 06:49:32.197851 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.197984 kubelet[2928]: E1124 06:49:32.197861 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.198765 kubelet[2928]: E1124 06:49:32.198347 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.198765 kubelet[2928]: W1124 06:49:32.198355 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.198765 kubelet[2928]: E1124 06:49:32.198362 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.198765 kubelet[2928]: E1124 06:49:32.198589 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.198765 kubelet[2928]: W1124 06:49:32.198599 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.198765 kubelet[2928]: E1124 06:49:32.198608 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.198765 kubelet[2928]: I1124 06:49:32.198628 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7x2\" (UniqueName: \"kubernetes.io/projected/0c6abf64-6464-41f7-b11b-979ba6b72128-kube-api-access-2s7x2\") pod \"csi-node-driver-n9f2t\" (UID: \"0c6abf64-6464-41f7-b11b-979ba6b72128\") " pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:32.198765 kubelet[2928]: E1124 06:49:32.198709 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.198765 kubelet[2928]: W1124 06:49:32.198716 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.198965 kubelet[2928]: E1124 06:49:32.198725 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.198965 kubelet[2928]: I1124 06:49:32.198736 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c6abf64-6464-41f7-b11b-979ba6b72128-kubelet-dir\") pod \"csi-node-driver-n9f2t\" (UID: \"0c6abf64-6464-41f7-b11b-979ba6b72128\") " pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:32.198965 kubelet[2928]: E1124 06:49:32.198850 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.198965 kubelet[2928]: W1124 06:49:32.198855 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.198965 kubelet[2928]: E1124 06:49:32.198860 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.198965 kubelet[2928]: I1124 06:49:32.198868 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0c6abf64-6464-41f7-b11b-979ba6b72128-socket-dir\") pod \"csi-node-driver-n9f2t\" (UID: \"0c6abf64-6464-41f7-b11b-979ba6b72128\") " pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:32.198965 kubelet[2928]: E1124 06:49:32.198964 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199103 kubelet[2928]: W1124 06:49:32.198968 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199103 kubelet[2928]: E1124 06:49:32.198974 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.199103 kubelet[2928]: I1124 06:49:32.198982 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0c6abf64-6464-41f7-b11b-979ba6b72128-varrun\") pod \"csi-node-driver-n9f2t\" (UID: \"0c6abf64-6464-41f7-b11b-979ba6b72128\") " pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:32.199103 kubelet[2928]: E1124 06:49:32.199061 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199103 kubelet[2928]: W1124 06:49:32.199065 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199103 kubelet[2928]: E1124 06:49:32.199071 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.199103 kubelet[2928]: I1124 06:49:32.199079 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0c6abf64-6464-41f7-b11b-979ba6b72128-registration-dir\") pod \"csi-node-driver-n9f2t\" (UID: \"0c6abf64-6464-41f7-b11b-979ba6b72128\") " pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:32.199263 kubelet[2928]: E1124 06:49:32.199150 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199263 kubelet[2928]: W1124 06:49:32.199155 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199263 kubelet[2928]: E1124 06:49:32.199160 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.199263 kubelet[2928]: E1124 06:49:32.199242 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199263 kubelet[2928]: W1124 06:49:32.199249 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199263 kubelet[2928]: E1124 06:49:32.199254 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.199391 kubelet[2928]: E1124 06:49:32.199336 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199391 kubelet[2928]: W1124 06:49:32.199341 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199391 kubelet[2928]: E1124 06:49:32.199347 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.199474 kubelet[2928]: E1124 06:49:32.199417 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199474 kubelet[2928]: W1124 06:49:32.199422 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199474 kubelet[2928]: E1124 06:49:32.199428 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.199548 kubelet[2928]: E1124 06:49:32.199496 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199548 kubelet[2928]: W1124 06:49:32.199502 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199548 kubelet[2928]: E1124 06:49:32.199510 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.199618 kubelet[2928]: E1124 06:49:32.199592 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.199618 kubelet[2928]: W1124 06:49:32.199597 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.199618 kubelet[2928]: E1124 06:49:32.199603 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199688 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.200932 kubelet[2928]: W1124 06:49:32.199697 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199704 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199781 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.200932 kubelet[2928]: W1124 06:49:32.199787 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199793 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199885 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.200932 kubelet[2928]: W1124 06:49:32.199892 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199898 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.200932 kubelet[2928]: E1124 06:49:32.199972 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.201935 containerd[1631]: time="2025-11-24T06:49:32.200854053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pc79c,Uid:b5e47b66-affd-44cd-8137-7196fbbb55d6,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:32.201972 kubelet[2928]: W1124 06:49:32.199977 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.201972 kubelet[2928]: E1124 06:49:32.199983 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.213036 containerd[1631]: time="2025-11-24T06:49:32.212999911Z" level=info msg="connecting to shim 199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7" address="unix:///run/containerd/s/0789f5073173d134575d63675fd8a0cd149e8447c02f5128267aaedbfbf8264d" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:32.236112 systemd[1]: Started cri-containerd-199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7.scope - libcontainer container 199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7. Nov 24 06:49:32.284777 containerd[1631]: time="2025-11-24T06:49:32.282061245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pc79c,Uid:b5e47b66-affd-44cd-8137-7196fbbb55d6,Namespace:calico-system,Attempt:0,} returns sandbox id \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\"" Nov 24 06:49:32.300111 kubelet[2928]: E1124 06:49:32.300046 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.300111 kubelet[2928]: W1124 06:49:32.300062 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.300111 kubelet[2928]: E1124 06:49:32.300075 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.300433 kubelet[2928]: E1124 06:49:32.300393 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.300433 kubelet[2928]: W1124 06:49:32.300403 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.300433 kubelet[2928]: E1124 06:49:32.300415 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.300520 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.301478 kubelet[2928]: W1124 06:49:32.300527 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.300535 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.301295 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.301478 kubelet[2928]: W1124 06:49:32.301303 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.301326 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.301435 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.301478 kubelet[2928]: W1124 06:49:32.301439 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.301478 kubelet[2928]: E1124 06:49:32.301447 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.301835 kubelet[2928]: E1124 06:49:32.301783 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.301835 kubelet[2928]: W1124 06:49:32.301791 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.301835 kubelet[2928]: E1124 06:49:32.301802 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.302066 kubelet[2928]: E1124 06:49:32.302009 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.303218 kubelet[2928]: W1124 06:49:32.302146 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.303218 kubelet[2928]: E1124 06:49:32.302161 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.303824 kubelet[2928]: E1124 06:49:32.303483 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.303824 kubelet[2928]: W1124 06:49:32.303500 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.303968 kubelet[2928]: E1124 06:49:32.303923 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.304145 kubelet[2928]: E1124 06:49:32.304136 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.304145 kubelet[2928]: W1124 06:49:32.304258 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.304506 kubelet[2928]: E1124 06:49:32.304383 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.304630 kubelet[2928]: E1124 06:49:32.304601 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.304841 kubelet[2928]: W1124 06:49:32.304831 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.305084 kubelet[2928]: E1124 06:49:32.305007 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.305084 kubelet[2928]: E1124 06:49:32.305021 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.305084 kubelet[2928]: W1124 06:49:32.305029 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.305283 kubelet[2928]: E1124 06:49:32.305173 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.305447 kubelet[2928]: E1124 06:49:32.305429 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.305447 kubelet[2928]: W1124 06:49:32.305437 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.305735 kubelet[2928]: E1124 06:49:32.305702 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.306355 kubelet[2928]: W1124 06:49:32.306345 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.306436 kubelet[2928]: E1124 06:49:32.306325 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.306436 kubelet[2928]: E1124 06:49:32.306428 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.306593 kubelet[2928]: E1124 06:49:32.306586 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.306653 kubelet[2928]: W1124 06:49:32.306643 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.306737 kubelet[2928]: E1124 06:49:32.306710 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.307073 kubelet[2928]: E1124 06:49:32.307031 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.307073 kubelet[2928]: W1124 06:49:32.307042 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.307073 kubelet[2928]: E1124 06:49:32.307061 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.307367 kubelet[2928]: E1124 06:49:32.307355 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.307576 kubelet[2928]: W1124 06:49:32.307449 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.307576 kubelet[2928]: E1124 06:49:32.307467 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.307822 kubelet[2928]: E1124 06:49:32.307807 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.308027 kubelet[2928]: W1124 06:49:32.307913 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.308027 kubelet[2928]: E1124 06:49:32.307934 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.308324 kubelet[2928]: E1124 06:49:32.308314 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.308610 kubelet[2928]: W1124 06:49:32.308574 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.308610 kubelet[2928]: E1124 06:49:32.308605 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.309238 kubelet[2928]: E1124 06:49:32.308760 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.309238 kubelet[2928]: W1124 06:49:32.308768 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.309238 kubelet[2928]: E1124 06:49:32.308786 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.309549 kubelet[2928]: E1124 06:49:32.309480 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.309549 kubelet[2928]: W1124 06:49:32.309493 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.309549 kubelet[2928]: E1124 06:49:32.309517 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.309760 kubelet[2928]: E1124 06:49:32.309705 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.309760 kubelet[2928]: W1124 06:49:32.309713 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.309844 kubelet[2928]: E1124 06:49:32.309829 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:32.310009 kubelet[2928]: E1124 06:49:32.309931 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.310009 kubelet[2928]: W1124 06:49:32.309940 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.310317 kubelet[2928]: E1124 06:49:32.310282 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.310795 kubelet[2928]: E1124 06:49:32.310726 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.310795 kubelet[2928]: W1124 06:49:32.310737 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.310795 kubelet[2928]: E1124 06:49:32.310762 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.311053 kubelet[2928]: E1124 06:49:32.310972 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.311053 kubelet[2928]: W1124 06:49:32.310983 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.311053 kubelet[2928]: E1124 06:49:32.310999 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.312276 kubelet[2928]: E1124 06:49:32.311236 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.312276 kubelet[2928]: W1124 06:49:32.311245 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.312276 kubelet[2928]: E1124 06:49:32.311254 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:32.315655 kubelet[2928]: E1124 06:49:32.315589 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:32.315655 kubelet[2928]: W1124 06:49:32.315603 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:32.315655 kubelet[2928]: E1124 06:49:32.315618 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:33.391922 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2122504336.mount: Deactivated successfully. Nov 24 06:49:33.966262 containerd[1631]: time="2025-11-24T06:49:33.966236433Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:33.966941 containerd[1631]: time="2025-11-24T06:49:33.966914266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Nov 24 06:49:33.967052 containerd[1631]: time="2025-11-24T06:49:33.967039821Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:33.968142 containerd[1631]: time="2025-11-24T06:49:33.968129340Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:33.968784 containerd[1631]: time="2025-11-24T06:49:33.968538127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 1.880442792s" Nov 24 06:49:33.968784 containerd[1631]: time="2025-11-24T06:49:33.968558551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Nov 24 06:49:33.969273 containerd[1631]: time="2025-11-24T06:49:33.969012391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Nov 24 06:49:33.982465 containerd[1631]: time="2025-11-24T06:49:33.982420180Z" level=info msg="CreateContainer within sandbox \"fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Nov 24 06:49:33.986739 containerd[1631]: time="2025-11-24T06:49:33.986629151Z" level=info msg="Container 09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:33.990870 containerd[1631]: time="2025-11-24T06:49:33.990840051Z" level=info msg="CreateContainer within sandbox \"fd452722ab5b052da7ca7a9980a184b555f84e74b8f52cac8973c48c292438fe\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802\"" Nov 24 06:49:33.992195 containerd[1631]: time="2025-11-24T06:49:33.992155438Z" level=info msg="StartContainer for \"09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802\"" Nov 24 06:49:33.994134 containerd[1631]: time="2025-11-24T06:49:33.994104645Z" level=info msg="connecting to shim 09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802" address="unix:///run/containerd/s/a7bba525b5253224328185540d91ff8f792574dcb3683e0c10e9b0eccab59763" protocol=ttrpc version=3 Nov 24 06:49:34.022326 systemd[1]: Started cri-containerd-09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802.scope - libcontainer container 09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802. 
Nov 24 06:49:34.072518 containerd[1631]: time="2025-11-24T06:49:34.072490625Z" level=info msg="StartContainer for \"09d059eea5c305bcf2c8ba77e17cfbab5e4e225b05b0b3378abbef20cb7a1802\" returns successfully" Nov 24 06:49:34.091893 kubelet[2928]: E1124 06:49:34.091857 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:34.209264 kubelet[2928]: E1124 06:49:34.209235 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209264 kubelet[2928]: W1124 06:49:34.209258 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.209381 kubelet[2928]: E1124 06:49:34.209274 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.209483 kubelet[2928]: E1124 06:49:34.209467 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209483 kubelet[2928]: W1124 06:49:34.209479 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.209575 kubelet[2928]: E1124 06:49:34.209488 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.209612 kubelet[2928]: E1124 06:49:34.209580 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209612 kubelet[2928]: W1124 06:49:34.209585 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.209612 kubelet[2928]: E1124 06:49:34.209591 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.209720 kubelet[2928]: E1124 06:49:34.209707 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209720 kubelet[2928]: W1124 06:49:34.209716 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.209810 kubelet[2928]: E1124 06:49:34.209722 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.209810 kubelet[2928]: E1124 06:49:34.209803 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209810 kubelet[2928]: W1124 06:49:34.209808 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.209871 kubelet[2928]: E1124 06:49:34.209812 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.209918 kubelet[2928]: E1124 06:49:34.209904 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.209918 kubelet[2928]: W1124 06:49:34.209915 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210007 kubelet[2928]: E1124 06:49:34.209922 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210007 kubelet[2928]: E1124 06:49:34.209989 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210007 kubelet[2928]: W1124 06:49:34.209994 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210007 kubelet[2928]: E1124 06:49:34.210000 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210321 kubelet[2928]: E1124 06:49:34.210311 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210321 kubelet[2928]: W1124 06:49:34.210319 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210492 kubelet[2928]: E1124 06:49:34.210325 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210514 kubelet[2928]: E1124 06:49:34.210492 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210514 kubelet[2928]: W1124 06:49:34.210498 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210514 kubelet[2928]: E1124 06:49:34.210506 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.210651 kubelet[2928]: E1124 06:49:34.210590 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210651 kubelet[2928]: W1124 06:49:34.210597 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210651 kubelet[2928]: E1124 06:49:34.210602 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210666 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210973 kubelet[2928]: W1124 06:49:34.210670 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210675 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210766 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210973 kubelet[2928]: W1124 06:49:34.210770 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210775 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210850 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.210973 kubelet[2928]: W1124 06:49:34.210854 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210860 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.210973 kubelet[2928]: E1124 06:49:34.210969 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.217915 kubelet[2928]: W1124 06:49:34.210974 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.217915 kubelet[2928]: E1124 06:49:34.210979 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.217915 kubelet[2928]: E1124 06:49:34.211054 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.217915 kubelet[2928]: W1124 06:49:34.211059 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.217915 kubelet[2928]: E1124 06:49:34.211065 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.218173 kubelet[2928]: E1124 06:49:34.218005 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.218173 kubelet[2928]: W1124 06:49:34.218023 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.218173 kubelet[2928]: E1124 06:49:34.218036 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218379 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224449 kubelet[2928]: W1124 06:49:34.218384 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218414 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218529 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224449 kubelet[2928]: W1124 06:49:34.218536 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218547 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218666 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224449 kubelet[2928]: W1124 06:49:34.218672 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218679 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.224449 kubelet[2928]: E1124 06:49:34.218773 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224631 kubelet[2928]: W1124 06:49:34.218778 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.218783 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.218899 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224631 kubelet[2928]: W1124 06:49:34.218903 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.218912 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.219047 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224631 kubelet[2928]: W1124 06:49:34.219052 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.219064 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224631 kubelet[2928]: E1124 06:49:34.219171 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224631 kubelet[2928]: W1124 06:49:34.219175 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219181 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219348 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224787 kubelet[2928]: W1124 06:49:34.219352 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219357 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219456 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224787 kubelet[2928]: W1124 06:49:34.219461 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219467 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219565 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.224787 kubelet[2928]: W1124 06:49:34.219569 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.224787 kubelet[2928]: E1124 06:49:34.219592 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219716 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233044 kubelet[2928]: W1124 06:49:34.219722 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219732 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219830 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233044 kubelet[2928]: W1124 06:49:34.219835 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219844 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219921 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233044 kubelet[2928]: W1124 06:49:34.219926 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.219936 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:34.233044 kubelet[2928]: E1124 06:49:34.220060 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233207 kubelet[2928]: W1124 06:49:34.220066 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.220079 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.220710 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233207 kubelet[2928]: W1124 06:49:34.220717 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.220731 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.220855 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233207 kubelet[2928]: W1124 06:49:34.220870 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.220904 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:34.233207 kubelet[2928]: E1124 06:49:34.221084 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:34.233207 kubelet[2928]: W1124 06:49:34.221089 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:34.233423 kubelet[2928]: E1124 06:49:34.221095 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.188151 kubelet[2928]: I1124 06:49:35.188033 2928 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 06:49:35.214803 kubelet[2928]: E1124 06:49:35.214751 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.214803 kubelet[2928]: W1124 06:49:35.214766 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.214803 kubelet[2928]: E1124 06:49:35.214778 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.215043 kubelet[2928]: E1124 06:49:35.215008 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215043 kubelet[2928]: W1124 06:49:35.215014 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215043 kubelet[2928]: E1124 06:49:35.215024 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.215202 kubelet[2928]: E1124 06:49:35.215171 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215202 kubelet[2928]: W1124 06:49:35.215176 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215202 kubelet[2928]: E1124 06:49:35.215182 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.215344 kubelet[2928]: E1124 06:49:35.215339 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215403 kubelet[2928]: W1124 06:49:35.215372 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215403 kubelet[2928]: E1124 06:49:35.215379 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.215540 kubelet[2928]: E1124 06:49:35.215512 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215540 kubelet[2928]: W1124 06:49:35.215518 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215540 kubelet[2928]: E1124 06:49:35.215523 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.215700 kubelet[2928]: E1124 06:49:35.215672 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215700 kubelet[2928]: W1124 06:49:35.215677 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215700 kubelet[2928]: E1124 06:49:35.215682 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.215860 kubelet[2928]: E1124 06:49:35.215825 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.215860 kubelet[2928]: W1124 06:49:35.215831 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.215860 kubelet[2928]: E1124 06:49:35.215839 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216020 kubelet[2928]: E1124 06:49:35.215988 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216020 kubelet[2928]: W1124 06:49:35.215993 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216020 kubelet[2928]: E1124 06:49:35.215998 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216173 kubelet[2928]: E1124 06:49:35.216145 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216173 kubelet[2928]: W1124 06:49:35.216151 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216173 kubelet[2928]: E1124 06:49:35.216156 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216351 kubelet[2928]: E1124 06:49:35.216316 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216351 kubelet[2928]: W1124 06:49:35.216322 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216351 kubelet[2928]: E1124 06:49:35.216327 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216510 kubelet[2928]: E1124 06:49:35.216478 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216510 kubelet[2928]: W1124 06:49:35.216483 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216510 kubelet[2928]: E1124 06:49:35.216488 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.216675 kubelet[2928]: E1124 06:49:35.216640 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216675 kubelet[2928]: W1124 06:49:35.216646 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216675 kubelet[2928]: E1124 06:49:35.216650 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216852 kubelet[2928]: E1124 06:49:35.216799 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216852 kubelet[2928]: W1124 06:49:35.216805 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216852 kubelet[2928]: E1124 06:49:35.216814 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.216999 kubelet[2928]: E1124 06:49:35.216971 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.216999 kubelet[2928]: W1124 06:49:35.216977 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.216999 kubelet[2928]: E1124 06:49:35.216981 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.217170 kubelet[2928]: E1124 06:49:35.217129 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.217170 kubelet[2928]: W1124 06:49:35.217136 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.217170 kubelet[2928]: E1124 06:49:35.217141 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.225313 kubelet[2928]: E1124 06:49:35.225302 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.225313 kubelet[2928]: W1124 06:49:35.225312 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.225385 kubelet[2928]: E1124 06:49:35.225320 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.225493 kubelet[2928]: E1124 06:49:35.225423 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.225493 kubelet[2928]: W1124 06:49:35.225443 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.225493 kubelet[2928]: E1124 06:49:35.225450 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.225614 kubelet[2928]: E1124 06:49:35.225599 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.225614 kubelet[2928]: W1124 06:49:35.225606 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.225714 kubelet[2928]: E1124 06:49:35.225658 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.225774 kubelet[2928]: E1124 06:49:35.225769 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.225807 kubelet[2928]: W1124 06:49:35.225802 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.225842 kubelet[2928]: E1124 06:49:35.225836 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.225991 kubelet[2928]: E1124 06:49:35.225944 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.225991 kubelet[2928]: W1124 06:49:35.225950 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.225991 kubelet[2928]: E1124 06:49:35.225958 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.226164 kubelet[2928]: E1124 06:49:35.226158 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.226239 kubelet[2928]: W1124 06:49:35.226190 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.226239 kubelet[2928]: E1124 06:49:35.226201 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.226706 kubelet[2928]: E1124 06:49:35.226688 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.226706 kubelet[2928]: W1124 06:49:35.226697 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.226753 kubelet[2928]: E1124 06:49:35.226717 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.226922 kubelet[2928]: E1124 06:49:35.226913 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.226944 kubelet[2928]: W1124 06:49:35.226925 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.226944 kubelet[2928]: E1124 06:49:35.226931 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227029 kubelet[2928]: E1124 06:49:35.227011 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227050 kubelet[2928]: W1124 06:49:35.227028 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227050 kubelet[2928]: E1124 06:49:35.227035 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227115 kubelet[2928]: E1124 06:49:35.227107 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227115 kubelet[2928]: W1124 06:49:35.227113 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227188 kubelet[2928]: E1124 06:49:35.227180 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227287 kubelet[2928]: E1124 06:49:35.227277 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227287 kubelet[2928]: W1124 06:49:35.227284 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227332 kubelet[2928]: E1124 06:49:35.227289 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.227478 kubelet[2928]: E1124 06:49:35.227467 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227478 kubelet[2928]: W1124 06:49:35.227474 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227522 kubelet[2928]: E1124 06:49:35.227482 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227731 kubelet[2928]: E1124 06:49:35.227720 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227757 kubelet[2928]: W1124 06:49:35.227739 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227757 kubelet[2928]: E1124 06:49:35.227752 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227842 kubelet[2928]: E1124 06:49:35.227833 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227842 kubelet[2928]: W1124 06:49:35.227840 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227880 kubelet[2928]: E1124 06:49:35.227848 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.227941 kubelet[2928]: E1124 06:49:35.227933 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.227963 kubelet[2928]: W1124 06:49:35.227940 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.227963 kubelet[2928]: E1124 06:49:35.227947 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.228065 kubelet[2928]: E1124 06:49:35.228056 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.228065 kubelet[2928]: W1124 06:49:35.228063 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.228102 kubelet[2928]: E1124 06:49:35.228068 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:49:35.228373 kubelet[2928]: E1124 06:49:35.228359 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.228373 kubelet[2928]: W1124 06:49:35.228369 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.228373 kubelet[2928]: E1124 06:49:35.228375 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.230143 kubelet[2928]: E1124 06:49:35.228557 2928 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:49:35.230143 kubelet[2928]: W1124 06:49:35.228582 2928 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:49:35.230143 kubelet[2928]: E1124 06:49:35.228589 2928 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:49:35.235189 containerd[1631]: time="2025-11-24T06:49:35.235160743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:35.238454 containerd[1631]: time="2025-11-24T06:49:35.238382729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Nov 24 06:49:35.238763 containerd[1631]: time="2025-11-24T06:49:35.238734853Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:35.239756 containerd[1631]: time="2025-11-24T06:49:35.239732733Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:35.240346 containerd[1631]: time="2025-11-24T06:49:35.240087530Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.271056711s" Nov 24 06:49:35.240346 containerd[1631]: time="2025-11-24T06:49:35.240109705Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Nov 24 06:49:35.242394 containerd[1631]: time="2025-11-24T06:49:35.242218020Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Nov 24 06:49:35.249702 containerd[1631]: time="2025-11-24T06:49:35.248419993Z" level=info msg="Container 6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d: 
CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:35.250884 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3632939974.mount: Deactivated successfully. Nov 24 06:49:35.256053 containerd[1631]: time="2025-11-24T06:49:35.256021749Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d\"" Nov 24 06:49:35.256745 containerd[1631]: time="2025-11-24T06:49:35.256698701Z" level=info msg="StartContainer for \"6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d\"" Nov 24 06:49:35.257745 containerd[1631]: time="2025-11-24T06:49:35.257731149Z" level=info msg="connecting to shim 6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d" address="unix:///run/containerd/s/0789f5073173d134575d63675fd8a0cd149e8447c02f5128267aaedbfbf8264d" protocol=ttrpc version=3 Nov 24 06:49:35.279333 systemd[1]: Started cri-containerd-6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d.scope - libcontainer container 6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d. Nov 24 06:49:35.352158 systemd[1]: cri-containerd-6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d.scope: Deactivated successfully. Nov 24 06:49:35.370873 containerd[1631]: time="2025-11-24T06:49:35.370315447Z" level=info msg="received container exit event container_id:\"6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d\" id:\"6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d\" pid:3635 exited_at:{seconds:1763966975 nanos:355627894}" Nov 24 06:49:35.370873 containerd[1631]: time="2025-11-24T06:49:35.370430765Z" level=info msg="StartContainer for \"6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d\" returns successfully" Nov 24 06:49:35.413994 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6537443241daad83b94ed2985bd91ee6c780dcd4cf1426a51626f309142ade6d-rootfs.mount: Deactivated successfully. 
Nov 24 06:49:36.092456 kubelet[2928]: E1124 06:49:36.092388 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:36.192684 containerd[1631]: time="2025-11-24T06:49:36.192637090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Nov 24 06:49:36.205237 kubelet[2928]: I1124 06:49:36.205171 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b4d889c99-psfk9" podStartSLOduration=3.324068443 podStartE2EDuration="5.205157563s" podCreationTimestamp="2025-11-24 06:49:31 +0000 UTC" firstStartedPulling="2025-11-24 06:49:32.087859096 +0000 UTC m=+18.159312711" lastFinishedPulling="2025-11-24 06:49:33.968948215 +0000 UTC m=+20.040401831" observedRunningTime="2025-11-24 06:49:34.204443673 +0000 UTC m=+20.275897315" watchObservedRunningTime="2025-11-24 06:49:36.205157563 +0000 UTC m=+22.276611192" Nov 24 06:49:38.092259 kubelet[2928]: E1124 06:49:38.091667 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:40.092263 kubelet[2928]: E1124 06:49:40.092211 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:40.290998 containerd[1631]: time="2025-11-24T06:49:40.290434850Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:40.291582 containerd[1631]: time="2025-11-24T06:49:40.291566459Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Nov 24 06:49:40.292178 containerd[1631]: time="2025-11-24T06:49:40.292160551Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:40.294359 containerd[1631]: time="2025-11-24T06:49:40.294341880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:40.294851 containerd[1631]: time="2025-11-24T06:49:40.294669342Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.102004331s" Nov 24 06:49:40.295003 containerd[1631]: time="2025-11-24T06:49:40.294920876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Nov 24 
06:49:40.298374 containerd[1631]: time="2025-11-24T06:49:40.298341735Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Nov 24 06:49:40.307522 containerd[1631]: time="2025-11-24T06:49:40.307486262Z" level=info msg="Container ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:40.315034 containerd[1631]: time="2025-11-24T06:49:40.315002764Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9\"" Nov 24 06:49:40.315618 containerd[1631]: time="2025-11-24T06:49:40.315494518Z" level=info msg="StartContainer for \"ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9\"" Nov 24 06:49:40.317684 containerd[1631]: time="2025-11-24T06:49:40.317658221Z" level=info msg="connecting to shim ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9" address="unix:///run/containerd/s/0789f5073173d134575d63675fd8a0cd149e8447c02f5128267aaedbfbf8264d" protocol=ttrpc version=3 Nov 24 06:49:40.343345 systemd[1]: Started cri-containerd-ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9.scope - libcontainer container ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9. Nov 24 06:49:40.391797 containerd[1631]: time="2025-11-24T06:49:40.391745635Z" level=info msg="StartContainer for \"ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9\" returns successfully" Nov 24 06:49:42.005358 systemd[1]: cri-containerd-ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9.scope: Deactivated successfully. Nov 24 06:49:42.006281 systemd[1]: cri-containerd-ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9.scope: Consumed 322ms CPU time, 160.6M memory peak, 2.2M read from disk, 171.3M written to disk. Nov 24 06:49:42.013675 containerd[1631]: time="2025-11-24T06:49:42.013401575Z" level=info msg="received container exit event container_id:\"ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9\" id:\"ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9\" pid:3695 exited_at:{seconds:1763966982 nanos:5892600}" Nov 24 06:49:42.053075 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ba0848295e56e2162cf83c40eeb57901e7848d87200e0d00274886708a7475d9-rootfs.mount: Deactivated successfully. Nov 24 06:49:42.100260 kubelet[2928]: E1124 06:49:42.096511 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:42.126499 kubelet[2928]: I1124 06:49:42.126478 2928 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Nov 24 06:49:42.361586 systemd[1]: Created slice kubepods-burstable-pod18595df3_20a1_4082_ac7b_f679e33292aa.slice - libcontainer container kubepods-burstable-pod18595df3_20a1_4082_ac7b_f679e33292aa.slice. Nov 24 06:49:42.366446 systemd[1]: Created slice kubepods-besteffort-podbc90fb97_b904_4a7f_ac89_30f34f24cf82.slice - libcontainer container kubepods-besteffort-podbc90fb97_b904_4a7f_ac89_30f34f24cf82.slice. 
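The "cni plugin not initialized" messages above persist until the container runtime finds a network configuration; the install-cni container started from the calico/cni image is what writes the CNI binaries and a conflist onto the host. A rough sketch of that readiness check, assuming containerd's default config directory /etc/cni/net.d and a Calico-style conflist name (both assumptions for illustration):

```go
// Rough check of what "NetworkReady=false ... cni plugin not initialized" is
// waiting for: at least one network conf(list) under the runtime's CNI config
// directory. Calico's install-cni step typically drops a file such as
// 10-calico.conflist there (name assumed).
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/cni/net.d" // containerd's default; assumed here
	matches, err := filepath.Glob(filepath.Join(confDir, "*.conf*"))
	if err != nil || len(matches) == 0 {
		fmt.Println("no CNI network config yet; NetworkReady stays false")
		os.Exit(1)
	}
	for _, m := range matches {
		fi, statErr := os.Stat(m)
		if statErr != nil {
			continue
		}
		fmt.Printf("found %s (%d bytes)\n", m, fi.Size())
	}
}
```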
Nov 24 06:49:42.369974 systemd[1]: Created slice kubepods-besteffort-podf9c4aeec_8d56_49ab_910e_5dc9d27b3e29.slice - libcontainer container kubepods-besteffort-podf9c4aeec_8d56_49ab_910e_5dc9d27b3e29.slice. Nov 24 06:49:42.373859 systemd[1]: Created slice kubepods-besteffort-pod3ac853da_b498_4eb2_aacf_2ea6168a1205.slice - libcontainer container kubepods-besteffort-pod3ac853da_b498_4eb2_aacf_2ea6168a1205.slice. Nov 24 06:49:42.376603 kubelet[2928]: I1124 06:49:42.376570 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-calico-apiserver-certs\") pod \"calico-apiserver-7887855f8c-7hjcp\" (UID: \"f9c4aeec-8d56-49ab-910e-5dc9d27b3e29\") " pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" Nov 24 06:49:42.376603 kubelet[2928]: I1124 06:49:42.376588 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8l6\" (UniqueName: \"kubernetes.io/projected/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-kube-api-access-kd8l6\") pod \"goldmane-666569f655-mxk2w\" (UID: \"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e\") " pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:42.376603 kubelet[2928]: I1124 06:49:42.376601 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/54ddf3ce-7798-43d6-964a-ec131b6bd310-calico-apiserver-certs\") pod \"calico-apiserver-66994bd4cb-j58dd\" (UID: \"54ddf3ce-7798-43d6-964a-ec131b6bd310\") " pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" Nov 24 06:49:42.376687 kubelet[2928]: I1124 06:49:42.376610 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzb4\" (UniqueName: \"kubernetes.io/projected/54ddf3ce-7798-43d6-964a-ec131b6bd310-kube-api-access-jrzb4\") pod \"calico-apiserver-66994bd4cb-j58dd\" (UID: \"54ddf3ce-7798-43d6-964a-ec131b6bd310\") " pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" Nov 24 06:49:42.376687 kubelet[2928]: I1124 06:49:42.376620 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmxh\" (UniqueName: \"kubernetes.io/projected/3ac853da-b498-4eb2-aacf-2ea6168a1205-kube-api-access-4wmxh\") pod \"calico-apiserver-7887855f8c-x6nck\" (UID: \"3ac853da-b498-4eb2-aacf-2ea6168a1205\") " pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" Nov 24 06:49:42.376687 kubelet[2928]: I1124 06:49:42.376631 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4d4\" (UniqueName: \"kubernetes.io/projected/bc90fb97-b904-4a7f-ac89-30f34f24cf82-kube-api-access-fn4d4\") pod \"whisker-7bbb894f64-8954x\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " pod="calico-system/whisker-7bbb894f64-8954x" Nov 24 06:49:42.376687 kubelet[2928]: I1124 06:49:42.376639 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqgvn\" (UniqueName: \"kubernetes.io/projected/a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68-kube-api-access-sqgvn\") pod \"calico-kube-controllers-7cdc8946dd-7dnrd\" (UID: \"a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68\") " pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" Nov 24 06:49:42.376687 kubelet[2928]: I1124 06:49:42.376651 2928 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb5zr\" (UniqueName: \"kubernetes.io/projected/18595df3-20a1-4082-ac7b-f679e33292aa-kube-api-access-lb5zr\") pod \"coredns-668d6bf9bc-jqtdz\" (UID: \"18595df3-20a1-4082-ac7b-f679e33292aa\") " pod="kube-system/coredns-668d6bf9bc-jqtdz" Nov 24 06:49:42.381844 kubelet[2928]: I1124 06:49:42.376660 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-ca-bundle\") pod \"goldmane-666569f655-mxk2w\" (UID: \"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e\") " pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:42.381844 kubelet[2928]: I1124 06:49:42.376669 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68-tigera-ca-bundle\") pod \"calico-kube-controllers-7cdc8946dd-7dnrd\" (UID: \"a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68\") " pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" Nov 24 06:49:42.381844 kubelet[2928]: I1124 06:49:42.376680 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3ac853da-b498-4eb2-aacf-2ea6168a1205-calico-apiserver-certs\") pod \"calico-apiserver-7887855f8c-x6nck\" (UID: \"3ac853da-b498-4eb2-aacf-2ea6168a1205\") " pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" Nov 24 06:49:42.381844 kubelet[2928]: I1124 06:49:42.376690 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle\") pod \"whisker-7bbb894f64-8954x\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " pod="calico-system/whisker-7bbb894f64-8954x" Nov 24 06:49:42.381844 kubelet[2928]: I1124 06:49:42.376701 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-config\") pod \"goldmane-666569f655-mxk2w\" (UID: \"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e\") " pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:42.377430 systemd[1]: Created slice kubepods-besteffort-poda2bb8cf5_1cc1_40de_90a6_0c1bddb1bf68.slice - libcontainer container kubepods-besteffort-poda2bb8cf5_1cc1_40de_90a6_0c1bddb1bf68.slice. 
Nov 24 06:49:42.381988 kubelet[2928]: I1124 06:49:42.376710 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-key-pair\") pod \"goldmane-666569f655-mxk2w\" (UID: \"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e\") " pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:42.381988 kubelet[2928]: I1124 06:49:42.376725 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair\") pod \"whisker-7bbb894f64-8954x\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " pod="calico-system/whisker-7bbb894f64-8954x" Nov 24 06:49:42.381988 kubelet[2928]: I1124 06:49:42.376735 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlhhs\" (UniqueName: \"kubernetes.io/projected/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-kube-api-access-wlhhs\") pod \"calico-apiserver-7887855f8c-7hjcp\" (UID: \"f9c4aeec-8d56-49ab-910e-5dc9d27b3e29\") " pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" Nov 24 06:49:42.381988 kubelet[2928]: I1124 06:49:42.376744 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18595df3-20a1-4082-ac7b-f679e33292aa-config-volume\") pod \"coredns-668d6bf9bc-jqtdz\" (UID: \"18595df3-20a1-4082-ac7b-f679e33292aa\") " pod="kube-system/coredns-668d6bf9bc-jqtdz" Nov 24 06:49:42.380495 systemd[1]: Created slice kubepods-besteffort-pod57b14474_5edf_4409_a6bd_e5a9f7dc6f4e.slice - libcontainer container kubepods-besteffort-pod57b14474_5edf_4409_a6bd_e5a9f7dc6f4e.slice. Nov 24 06:49:42.385522 systemd[1]: Created slice kubepods-besteffort-pod54ddf3ce_7798_43d6_964a_ec131b6bd310.slice - libcontainer container kubepods-besteffort-pod54ddf3ce_7798_43d6_964a_ec131b6bd310.slice. 
Nov 24 06:49:42.405744 kubelet[2928]: W1124 06:49:42.405607 2928 reflector.go:569] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Nov 24 06:49:42.407448 kubelet[2928]: E1124 06:49:42.407387 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.410896 kubelet[2928]: W1124 06:49:42.410835 2928 reflector.go:569] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Nov 24 06:49:42.411146 kubelet[2928]: E1124 06:49:42.411063 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.411289 kubelet[2928]: W1124 06:49:42.411114 2928 reflector.go:569] object-"calico-system"/"goldmane": failed to list *v1.ConfigMap: configmaps "goldmane" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Nov 24 06:49:42.411360 kubelet[2928]: E1124 06:49:42.411337 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.411747 kubelet[2928]: W1124 06:49:42.411691 2928 reflector.go:569] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Nov 24 06:49:42.411747 kubelet[2928]: E1124 06:49:42.411706 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.411901 kubelet[2928]: W1124 06:49:42.411841 2928 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: 
User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Nov 24 06:49:42.411901 kubelet[2928]: E1124 06:49:42.411856 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.411901 kubelet[2928]: W1124 06:49:42.411880 2928 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Nov 24 06:49:42.412078 kubelet[2928]: W1124 06:49:42.411991 2928 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Nov 24 06:49:42.412078 kubelet[2928]: E1124 06:49:42.412005 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.412292 kubelet[2928]: E1124 06:49:42.411888 2928 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Nov 24 06:49:42.418443 systemd[1]: Created slice kubepods-burstable-pod8d5ea9ae_70a1_4d7a_a9c6_6b6b766f4ac9.slice - libcontainer container kubepods-burstable-pod8d5ea9ae_70a1_4d7a_a9c6_6b6b766f4ac9.slice. 
Nov 24 06:49:42.477843 kubelet[2928]: I1124 06:49:42.477813 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9-config-volume\") pod \"coredns-668d6bf9bc-jmbwk\" (UID: \"8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9\") " pod="kube-system/coredns-668d6bf9bc-jmbwk" Nov 24 06:49:42.477943 kubelet[2928]: I1124 06:49:42.477895 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gph\" (UniqueName: \"kubernetes.io/projected/8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9-kube-api-access-m2gph\") pod \"coredns-668d6bf9bc-jmbwk\" (UID: \"8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9\") " pod="kube-system/coredns-668d6bf9bc-jmbwk" Nov 24 06:49:42.665199 containerd[1631]: time="2025-11-24T06:49:42.664582415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jqtdz,Uid:18595df3-20a1-4082-ac7b-f679e33292aa,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:42.680035 containerd[1631]: time="2025-11-24T06:49:42.679838452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cdc8946dd-7dnrd,Uid:a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:42.722256 containerd[1631]: time="2025-11-24T06:49:42.722235581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jmbwk,Uid:8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:43.070013 containerd[1631]: time="2025-11-24T06:49:43.069980228Z" level=error msg="Failed to destroy network for sandbox \"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.071974 containerd[1631]: time="2025-11-24T06:49:43.071745237Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jmbwk,Uid:8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.072161 systemd[1]: run-netns-cni\x2d900122e3\x2d292d\x2db5f2\x2d1761\x2d2487c1bfeaa4.mount: Deactivated successfully. 
Nov 24 06:49:43.073893 kubelet[2928]: E1124 06:49:43.073623 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.073893 kubelet[2928]: E1124 06:49:43.073671 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jmbwk" Nov 24 06:49:43.073893 kubelet[2928]: E1124 06:49:43.073684 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jmbwk" Nov 24 06:49:43.073986 kubelet[2928]: E1124 06:49:43.073714 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-jmbwk_kube-system(8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-jmbwk_kube-system(8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd5191f7c35c6741ed3c8326b53b97b5a317940d862f0569766f2e3bcce84e35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-jmbwk" podUID="8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9" Nov 24 06:49:43.086100 containerd[1631]: time="2025-11-24T06:49:43.086062883Z" level=error msg="Failed to destroy network for sandbox \"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.088927 systemd[1]: run-netns-cni\x2d186cf427\x2dae37\x2d8063\x2d91a6\x2df0ba451a98c1.mount: Deactivated successfully. 
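Every RunPodSandbox failure in this stretch traces to the same stat, and the error text itself says what to check: the Calico CNI plugin reads /var/lib/calico/nodename, a file written by the calico/node container once it is running, to learn which node it is on. A minimal reproduction of that check, as a sketch rather than the plugin's actual code:

```go
// Reproduce the failing stat behind "plugin type=\"calico\" failed (add):
// stat /var/lib/calico/nodename: no such file or directory".
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		// This is the state the log is in: calico/node has not started yet,
		// so every pod sandbox setup and teardown fails at this point.
		fmt.Fprintf(os.Stderr, "stat %s: %v\n", nodenameFile, err)
		os.Exit(1)
	}
	fmt.Printf("calico node name: %s\n", strings.TrimSpace(string(data)))
}
```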
Nov 24 06:49:43.090673 containerd[1631]: time="2025-11-24T06:49:43.089332716Z" level=error msg="Failed to destroy network for sandbox \"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.090728 containerd[1631]: time="2025-11-24T06:49:43.090706229Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jqtdz,Uid:18595df3-20a1-4082-ac7b-f679e33292aa,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.091072 containerd[1631]: time="2025-11-24T06:49:43.091050762Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cdc8946dd-7dnrd,Uid:a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.091804 kubelet[2928]: E1124 06:49:43.091147 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.091804 kubelet[2928]: E1124 06:49:43.091180 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jqtdz" Nov 24 06:49:43.091804 kubelet[2928]: E1124 06:49:43.091193 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jqtdz" Nov 24 06:49:43.091804 kubelet[2928]: E1124 06:49:43.091146 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:43.091962 kubelet[2928]: E1124 06:49:43.091241 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" Nov 24 06:49:43.091962 kubelet[2928]: E1124 06:49:43.091252 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" Nov 24 06:49:43.091962 kubelet[2928]: E1124 06:49:43.091270 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cdc8946dd-7dnrd_calico-system(a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cdc8946dd-7dnrd_calico-system(a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8b35914a40eb8b37a9fb2d13c60bda8c700ec02fc657a45f4bcdf7e3c53ae8d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:49:43.092090 kubelet[2928]: E1124 06:49:43.091755 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-jqtdz_kube-system(18595df3-20a1-4082-ac7b-f679e33292aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-jqtdz_kube-system(18595df3-20a1-4082-ac7b-f679e33292aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ce4ef937e0c3cb099e54a1298c49b8e7a232d8a4b8809da8e0a5e336aab28085\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-jqtdz" podUID="18595df3-20a1-4082-ac7b-f679e33292aa" Nov 24 06:49:43.092169 systemd[1]: run-netns-cni\x2da48a7f53\x2d843a\x2dc319\x2d4456\x2d67ac276d8d00.mount: Deactivated successfully. Nov 24 06:49:43.235943 containerd[1631]: time="2025-11-24T06:49:43.235914741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Nov 24 06:49:43.479246 kubelet[2928]: E1124 06:49:43.479070 2928 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.479246 kubelet[2928]: E1124 06:49:43.479140 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle podName:bc90fb97-b904-4a7f-ac89-30f34f24cf82 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979113795 +0000 UTC m=+30.050567414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle") pod "whisker-7bbb894f64-8954x" (UID: "bc90fb97-b904-4a7f-ac89-30f34f24cf82") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479707 2928 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479721 2928 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479748 2928 configmap.go:193] Couldn't get configMap calico-system/goldmane: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479749 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-calico-apiserver-certs podName:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979741822 +0000 UTC m=+30.051195440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-calico-apiserver-certs") pod "calico-apiserver-7887855f8c-7hjcp" (UID: "f9c4aeec-8d56-49ab-910e-5dc9d27b3e29") : failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479760 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54ddf3ce-7798-43d6-964a-ec131b6bd310-calico-apiserver-certs podName:54ddf3ce-7798-43d6-964a-ec131b6bd310 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979755695 +0000 UTC m=+30.051209313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/54ddf3ce-7798-43d6-964a-ec131b6bd310-calico-apiserver-certs") pod "calico-apiserver-66994bd4cb-j58dd" (UID: "54ddf3ce-7798-43d6-964a-ec131b6bd310") : failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.479937 kubelet[2928]: E1124 06:49:43.479769 2928 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.480783 kubelet[2928]: E1124 06:49:43.479783 2928 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.480783 kubelet[2928]: E1124 06:49:43.479783 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-config podName:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979768646 +0000 UTC m=+30.051222262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-config") pod "goldmane-666569f655-mxk2w" (UID: "57b14474-5edf-4409-a6bd-e5a9f7dc6f4e") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.480783 kubelet[2928]: E1124 06:49:43.479792 2928 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.480783 kubelet[2928]: E1124 06:49:43.479799 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-ca-bundle podName:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979793726 +0000 UTC m=+30.051247340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-ca-bundle") pod "goldmane-666569f655-mxk2w" (UID: "57b14474-5edf-4409-a6bd-e5a9f7dc6f4e") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.481462 kubelet[2928]: E1124 06:49:43.479810 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ac853da-b498-4eb2-aacf-2ea6168a1205-calico-apiserver-certs podName:3ac853da-b498-4eb2-aacf-2ea6168a1205 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979803091 +0000 UTC m=+30.051256704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/3ac853da-b498-4eb2-aacf-2ea6168a1205-calico-apiserver-certs") pod "calico-apiserver-7887855f8c-x6nck" (UID: "3ac853da-b498-4eb2-aacf-2ea6168a1205") : failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.481462 kubelet[2928]: E1124 06:49:43.479816 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-key-pair podName:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.979812913 +0000 UTC m=+30.051266526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/57b14474-5edf-4409-a6bd-e5a9f7dc6f4e-goldmane-key-pair") pod "goldmane-666569f655-mxk2w" (UID: "57b14474-5edf-4409-a6bd-e5a9f7dc6f4e") : failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.481462 kubelet[2928]: E1124 06:49:43.479818 2928 secret.go:189] Couldn't get secret calico-system/whisker-backend-key-pair: failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.481552 kubelet[2928]: E1124 06:49:43.479857 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair podName:bc90fb97-b904-4a7f-ac89-30f34f24cf82 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.97982684 +0000 UTC m=+30.051280453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-backend-key-pair" (UniqueName: "kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair") pod "whisker-7bbb894f64-8954x" (UID: "bc90fb97-b904-4a7f-ac89-30f34f24cf82") : failed to sync secret cache: timed out waiting for the condition Nov 24 06:49:43.483190 kubelet[2928]: E1124 06:49:43.483117 2928 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.483190 kubelet[2928]: E1124 06:49:43.483134 2928 projected.go:194] Error preparing data for projected volume kube-api-access-wlhhs for pod calico-apiserver/calico-apiserver-7887855f8c-7hjcp: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.483190 kubelet[2928]: E1124 06:49:43.483173 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-kube-api-access-wlhhs podName:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.983165232 +0000 UTC m=+30.054618849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wlhhs" (UniqueName: "kubernetes.io/projected/f9c4aeec-8d56-49ab-910e-5dc9d27b3e29-kube-api-access-wlhhs") pod "calico-apiserver-7887855f8c-7hjcp" (UID: "f9c4aeec-8d56-49ab-910e-5dc9d27b3e29") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496272 kubelet[2928]: E1124 06:49:43.496243 2928 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496272 kubelet[2928]: E1124 06:49:43.496267 2928 projected.go:194] Error preparing data for projected volume kube-api-access-jrzb4 for pod calico-apiserver/calico-apiserver-66994bd4cb-j58dd: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496384 kubelet[2928]: E1124 06:49:43.496312 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54ddf3ce-7798-43d6-964a-ec131b6bd310-kube-api-access-jrzb4 podName:54ddf3ce-7798-43d6-964a-ec131b6bd310 nodeName:}" failed. No retries permitted until 2025-11-24 06:49:43.996286462 +0000 UTC m=+30.067740079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jrzb4" (UniqueName: "kubernetes.io/projected/54ddf3ce-7798-43d6-964a-ec131b6bd310-kube-api-access-jrzb4") pod "calico-apiserver-66994bd4cb-j58dd" (UID: "54ddf3ce-7798-43d6-964a-ec131b6bd310") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496496 kubelet[2928]: E1124 06:49:43.496243 2928 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496496 kubelet[2928]: E1124 06:49:43.496456 2928 projected.go:194] Error preparing data for projected volume kube-api-access-4wmxh for pod calico-apiserver/calico-apiserver-7887855f8c-x6nck: failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:43.496496 kubelet[2928]: E1124 06:49:43.496477 2928 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ac853da-b498-4eb2-aacf-2ea6168a1205-kube-api-access-4wmxh podName:3ac853da-b498-4eb2-aacf-2ea6168a1205 nodeName:}" failed. 
No retries permitted until 2025-11-24 06:49:43.996470429 +0000 UTC m=+30.067924045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4wmxh" (UniqueName: "kubernetes.io/projected/3ac853da-b498-4eb2-aacf-2ea6168a1205-kube-api-access-4wmxh") pod "calico-apiserver-7887855f8c-x6nck" (UID: "3ac853da-b498-4eb2-aacf-2ea6168a1205") : failed to sync configmap cache: timed out waiting for the condition Nov 24 06:49:44.102341 systemd[1]: Created slice kubepods-besteffort-pod0c6abf64_6464_41f7_b11b_979ba6b72128.slice - libcontainer container kubepods-besteffort-pod0c6abf64_6464_41f7_b11b_979ba6b72128.slice. Nov 24 06:49:44.104312 containerd[1631]: time="2025-11-24T06:49:44.104286315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-n9f2t,Uid:0c6abf64-6464-41f7-b11b-979ba6b72128,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:44.141242 containerd[1631]: time="2025-11-24T06:49:44.140259786Z" level=error msg="Failed to destroy network for sandbox \"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.141545 systemd[1]: run-netns-cni\x2d9a354290\x2dea5c\x2def72\x2d4411\x2daaed5b8ecdf4.mount: Deactivated successfully. Nov 24 06:49:44.144506 containerd[1631]: time="2025-11-24T06:49:44.144466657Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-n9f2t,Uid:0c6abf64-6464-41f7-b11b-979ba6b72128,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.144702 kubelet[2928]: E1124 06:49:44.144678 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.144751 kubelet[2928]: E1124 06:49:44.144715 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:44.144751 kubelet[2928]: E1124 06:49:44.144728 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-n9f2t" Nov 24 06:49:44.144788 kubelet[2928]: E1124 06:49:44.144755 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" 
for \"csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8939264c46afa40746f21d8a2ed23532e13f1124aaa54d71f8513373f7321f49\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:44.168751 containerd[1631]: time="2025-11-24T06:49:44.168731127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bbb894f64-8954x,Uid:bc90fb97-b904-4a7f-ac89-30f34f24cf82,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:44.172285 containerd[1631]: time="2025-11-24T06:49:44.172266960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-7hjcp,Uid:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:44.175714 containerd[1631]: time="2025-11-24T06:49:44.175675957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-x6nck,Uid:3ac853da-b498-4eb2-aacf-2ea6168a1205,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:44.183323 containerd[1631]: time="2025-11-24T06:49:44.183268749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mxk2w,Uid:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:44.215163 containerd[1631]: time="2025-11-24T06:49:44.215136659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66994bd4cb-j58dd,Uid:54ddf3ce-7798-43d6-964a-ec131b6bd310,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:44.275271 containerd[1631]: time="2025-11-24T06:49:44.275218323Z" level=error msg="Failed to destroy network for sandbox \"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.277879 containerd[1631]: time="2025-11-24T06:49:44.277831933Z" level=error msg="Failed to destroy network for sandbox \"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.279926 containerd[1631]: time="2025-11-24T06:49:44.279860038Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bbb894f64-8954x,Uid:bc90fb97-b904-4a7f-ac89-30f34f24cf82,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.284305 containerd[1631]: time="2025-11-24T06:49:44.284284838Z" level=error msg="Failed to destroy network for sandbox \"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.284620 containerd[1631]: time="2025-11-24T06:49:44.284598071Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-x6nck,Uid:3ac853da-b498-4eb2-aacf-2ea6168a1205,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.286240 containerd[1631]: time="2025-11-24T06:49:44.286214929Z" level=error msg="Failed to destroy network for sandbox \"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.289765 containerd[1631]: time="2025-11-24T06:49:44.289746208Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mxk2w,Uid:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.292100 containerd[1631]: time="2025-11-24T06:49:44.292079363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-7hjcp,Uid:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.293802 kubelet[2928]: E1124 06:49:44.293771 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.293843 kubelet[2928]: E1124 06:49:44.293816 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bbb894f64-8954x" Nov 24 06:49:44.294281 kubelet[2928]: E1124 06:49:44.294212 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.294281 kubelet[2928]: E1124 06:49:44.294241 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" Nov 24 06:49:44.294602 containerd[1631]: time="2025-11-24T06:49:44.294589288Z" level=error msg="Failed to destroy network for sandbox \"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.296810 containerd[1631]: time="2025-11-24T06:49:44.296796090Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66994bd4cb-j58dd,Uid:54ddf3ce-7798-43d6-964a-ec131b6bd310,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.298872 kubelet[2928]: E1124 06:49:44.298855 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.298910 kubelet[2928]: E1124 06:49:44.298876 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" Nov 24 06:49:44.298910 kubelet[2928]: E1124 06:49:44.298887 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" Nov 24 06:49:44.306286 kubelet[2928]: E1124 06:49:44.306237 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" Nov 24 06:49:44.307075 kubelet[2928]: E1124 
06:49:44.306929 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dbf01c2fa64eded797bf6b512777277b45207095c68c2145314409cd6410e6e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:49:44.307075 kubelet[2928]: E1124 06:49:44.306943 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bbb894f64-8954x" Nov 24 06:49:44.307075 kubelet[2928]: E1124 06:49:44.306967 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.307839 kubelet[2928]: E1124 06:49:44.306966 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7bbb894f64-8954x_calico-system(bc90fb97-b904-4a7f-ac89-30f34f24cf82)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7bbb894f64-8954x_calico-system(bc90fb97-b904-4a7f-ac89-30f34f24cf82)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48c1a59cf269a571eb639cd31ef5659d05e0a0fff1601a2d318ea6896456098e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7bbb894f64-8954x" podUID="bc90fb97-b904-4a7f-ac89-30f34f24cf82" Nov 24 06:49:44.307839 kubelet[2928]: E1124 06:49:44.306932 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f25fc1e801b45b962ac1fb46625367052b6268816229c63ea5b7717c5cb233a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:49:44.307839 kubelet[2928]: E1124 06:49:44.306989 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:44.307932 kubelet[2928]: E1124 06:49:44.306999 2928 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:49:44.307932 kubelet[2928]: E1124 06:49:44.307000 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-mxk2w" Nov 24 06:49:44.307932 kubelet[2928]: E1124 06:49:44.307011 2928 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" Nov 24 06:49:44.308017 kubelet[2928]: E1124 06:49:44.307017 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b44ca77f273026c2a0e8fff1af7ec9d3a73ea58cda24a8ebfe5fd7a429468630\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:49:44.308017 kubelet[2928]: E1124 06:49:44.307021 2928 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" Nov 24 06:49:44.308017 kubelet[2928]: E1124 06:49:44.307048 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"2e3bf4441c1cb140ec5c785b1d2b954ca7f51bc4f502c8952288689ced848de4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:49:47.534312 kubelet[2928]: I1124 06:49:47.534291 2928 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 06:49:48.449895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2077779117.mount: Deactivated successfully. Nov 24 06:49:48.717133 containerd[1631]: time="2025-11-24T06:49:48.717032347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:48.731463 containerd[1631]: time="2025-11-24T06:49:48.731351803Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Nov 24 06:49:48.765455 containerd[1631]: time="2025-11-24T06:49:48.765358152Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:48.774600 containerd[1631]: time="2025-11-24T06:49:48.774497702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:49:48.775023 containerd[1631]: time="2025-11-24T06:49:48.774890304Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.538942555s" Nov 24 06:49:48.775023 containerd[1631]: time="2025-11-24T06:49:48.774912382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Nov 24 06:49:48.930700 containerd[1631]: time="2025-11-24T06:49:48.930673435Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Nov 24 06:49:48.991833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount770730629.mount: Deactivated successfully. 
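The sandbox errors above all trace back to one condition: the Calico CNI plugin refuses to add or delete pod networking until calico-node has written /var/lib/calico/nodename, which is exactly what the "check that the calico/node container is running and has mounted /var/lib/calico/" hint refers to. Once the calico/node image pulled above is started, those RunPodSandbox retries succeed. A minimal sketch of waiting for that file; the path comes from the log, but the 500 ms poll interval and 2-minute deadline are illustrative assumptions, not values the components use.

```go
// Sketch: block until calico-node has created /var/lib/calico/nodename,
// or give up after a deadline. Interval and timeout are assumed values.
package main

import (
	"fmt"
	"os"
	"time"
)

func waitForNodename(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // calico-node has initialised this host
		} else if !os.IsNotExist(err) {
			return err // unexpected stat error, surface it
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(interval)
	}
}

func main() {
	if err := waitForNodename("/var/lib/calico/nodename", 500*time.Millisecond, 2*time.Minute); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("calico-node is ready")
}
```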
Nov 24 06:49:49.001643 containerd[1631]: time="2025-11-24T06:49:48.991789134Z" level=info msg="Container 96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:49.107811 containerd[1631]: time="2025-11-24T06:49:49.107773784Z" level=info msg="CreateContainer within sandbox \"199a4435b44f96af6bc7a63566df7980ac12d9c9556f793f3ce57e6579dc6cd7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12\"" Nov 24 06:49:49.108282 containerd[1631]: time="2025-11-24T06:49:49.108249365Z" level=info msg="StartContainer for \"96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12\"" Nov 24 06:49:49.117181 containerd[1631]: time="2025-11-24T06:49:49.117121485Z" level=info msg="connecting to shim 96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12" address="unix:///run/containerd/s/0789f5073173d134575d63675fd8a0cd149e8447c02f5128267aaedbfbf8264d" protocol=ttrpc version=3 Nov 24 06:49:49.246339 systemd[1]: Started cri-containerd-96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12.scope - libcontainer container 96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12. Nov 24 06:49:49.308744 containerd[1631]: time="2025-11-24T06:49:49.308629704Z" level=info msg="StartContainer for \"96f01a65efc1b490fa39e2e5ecaa314bf6f393cc1b96ba07aafb6312da9e4c12\" returns successfully" Nov 24 06:49:49.386458 kubelet[2928]: I1124 06:49:49.386390 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-pc79c" podStartSLOduration=1.8961889379999999 podStartE2EDuration="18.386369895s" podCreationTimestamp="2025-11-24 06:49:31 +0000 UTC" firstStartedPulling="2025-11-24 06:49:32.285582268 +0000 UTC m=+18.357035884" lastFinishedPulling="2025-11-24 06:49:48.775763221 +0000 UTC m=+34.847216841" observedRunningTime="2025-11-24 06:49:49.385312531 +0000 UTC m=+35.456766167" watchObservedRunningTime="2025-11-24 06:49:49.386369895 +0000 UTC m=+35.457823515" Nov 24 06:49:49.800344 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Nov 24 06:49:49.805191 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
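The pod_startup_latency_tracker entry above reports two figures for calico-node-pc79c: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of that arithmetic with the timestamps copied from the log; the result differs from the logged 1.896188938s by a few nanoseconds, which is consistent with the kubelet working from the m=+… monotonic offsets rather than the formatted wall-clock times.

```go
// Sketch: reproduce the startup-latency arithmetic from the logged timestamps.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Fractional seconds in the input are accepted even though the layout omits them.
	t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-24 06:49:31 +0000 UTC")            // podCreationTimestamp
	firstPull := mustParse("2025-11-24 06:49:32.285582268 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2025-11-24 06:49:48.775763221 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2025-11-24 06:49:49.386369895 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)          // ≈ 18.386369895s
	slo := e2e - lastPull.Sub(firstPull) // ≈ 1.896188942s (image-pull time excluded)
	fmt.Printf("podStartE2EDuration=%v podStartSLOduration=%v\n", e2e, slo)
}
```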
Nov 24 06:49:50.051130 kubelet[2928]: I1124 06:49:50.049367 2928 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn4d4\" (UniqueName: \"kubernetes.io/projected/bc90fb97-b904-4a7f-ac89-30f34f24cf82-kube-api-access-fn4d4\") pod \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " Nov 24 06:49:50.051130 kubelet[2928]: I1124 06:49:50.049400 2928 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle\") pod \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " Nov 24 06:49:50.051130 kubelet[2928]: I1124 06:49:50.049419 2928 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair\") pod \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\" (UID: \"bc90fb97-b904-4a7f-ac89-30f34f24cf82\") " Nov 24 06:49:50.057431 kubelet[2928]: I1124 06:49:50.057409 2928 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bc90fb97-b904-4a7f-ac89-30f34f24cf82" (UID: "bc90fb97-b904-4a7f-ac89-30f34f24cf82"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 24 06:49:50.065468 systemd[1]: var-lib-kubelet-pods-bc90fb97\x2db904\x2d4a7f\x2dac89\x2d30f34f24cf82-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Nov 24 06:49:50.065535 systemd[1]: var-lib-kubelet-pods-bc90fb97\x2db904\x2d4a7f\x2dac89\x2d30f34f24cf82-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dfn4d4.mount: Deactivated successfully. Nov 24 06:49:50.067631 kubelet[2928]: I1124 06:49:50.067594 2928 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bc90fb97-b904-4a7f-ac89-30f34f24cf82" (UID: "bc90fb97-b904-4a7f-ac89-30f34f24cf82"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 24 06:49:50.069784 kubelet[2928]: I1124 06:49:50.069759 2928 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc90fb97-b904-4a7f-ac89-30f34f24cf82-kube-api-access-fn4d4" (OuterVolumeSpecName: "kube-api-access-fn4d4") pod "bc90fb97-b904-4a7f-ac89-30f34f24cf82" (UID: "bc90fb97-b904-4a7f-ac89-30f34f24cf82"). InnerVolumeSpecName "kube-api-access-fn4d4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 24 06:49:50.107668 systemd[1]: Removed slice kubepods-besteffort-podbc90fb97_b904_4a7f_ac89_30f34f24cf82.slice - libcontainer container kubepods-besteffort-podbc90fb97_b904_4a7f_ac89_30f34f24cf82.slice. 
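The mount units deactivated above (var-lib-kubelet-pods-…\x2d….mount) are the pod's volume paths run through systemd's unit-name escaping: '/' becomes '-', and bytes outside the allowed set are written as \xNN escapes, which is why every '-' inside the pod UID appears as \x2d and the '~' in kubernetes.io~secret as \x7e. A simplified sketch of that escaping follows; the real systemd-escape also handles leading dots, empty paths and repeated slashes, which this illustration skips.

```go
// Sketch: simplified systemd path escaping, enough to reproduce the
// whisker-backend-key-pair mount unit name seen in the log above.
package main

import (
	"fmt"
	"strings"
)

func escapePath(p string) string {
	p = strings.Trim(p, "/") // unit names drop leading/trailing slashes
	var b strings.Builder
	for i := 0; i < len(p); i++ {
		c := p[i]
		switch {
		case c == '/':
			b.WriteByte('-')
		case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z',
			c >= '0' && c <= '9', c == '_', c == '.', c == ':':
			b.WriteByte(c)
		default:
			fmt.Fprintf(&b, `\x%02x`, c) // e.g. '-' -> \x2d, '~' -> \x7e
		}
	}
	return b.String()
}

func main() {
	path := "/var/lib/kubelet/pods/bc90fb97-b904-4a7f-ac89-30f34f24cf82/volumes/kubernetes.io~secret/whisker-backend-key-pair"
	fmt.Println(escapePath(path) + ".mount")
}
```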
Nov 24 06:49:50.152295 kubelet[2928]: I1124 06:49:50.152249 2928 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fn4d4\" (UniqueName: \"kubernetes.io/projected/bc90fb97-b904-4a7f-ac89-30f34f24cf82-kube-api-access-fn4d4\") on node \"localhost\" DevicePath \"\"" Nov 24 06:49:50.152295 kubelet[2928]: I1124 06:49:50.152278 2928 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Nov 24 06:49:50.152467 kubelet[2928]: I1124 06:49:50.152284 2928 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bc90fb97-b904-4a7f-ac89-30f34f24cf82-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Nov 24 06:49:50.452334 systemd[1]: Created slice kubepods-besteffort-pod0c43478b_1cfb_4a98_8686_d4d93291e6b2.slice - libcontainer container kubepods-besteffort-pod0c43478b_1cfb_4a98_8686_d4d93291e6b2.slice. Nov 24 06:49:50.455846 kubelet[2928]: I1124 06:49:50.455562 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0c43478b-1cfb-4a98-8686-d4d93291e6b2-whisker-backend-key-pair\") pod \"whisker-bc589b8c4-9bnlt\" (UID: \"0c43478b-1cfb-4a98-8686-d4d93291e6b2\") " pod="calico-system/whisker-bc589b8c4-9bnlt" Nov 24 06:49:50.455846 kubelet[2928]: I1124 06:49:50.455601 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svvc\" (UniqueName: \"kubernetes.io/projected/0c43478b-1cfb-4a98-8686-d4d93291e6b2-kube-api-access-4svvc\") pod \"whisker-bc589b8c4-9bnlt\" (UID: \"0c43478b-1cfb-4a98-8686-d4d93291e6b2\") " pod="calico-system/whisker-bc589b8c4-9bnlt" Nov 24 06:49:50.455846 kubelet[2928]: I1124 06:49:50.455616 2928 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c43478b-1cfb-4a98-8686-d4d93291e6b2-whisker-ca-bundle\") pod \"whisker-bc589b8c4-9bnlt\" (UID: \"0c43478b-1cfb-4a98-8686-d4d93291e6b2\") " pod="calico-system/whisker-bc589b8c4-9bnlt" Nov 24 06:49:50.757457 containerd[1631]: time="2025-11-24T06:49:50.757433162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bc589b8c4-9bnlt,Uid:0c43478b-1cfb-4a98-8686-d4d93291e6b2,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:51.069389 systemd-networkd[1508]: cali7a0e6d993ea: Link UP Nov 24 06:49:51.069507 systemd-networkd[1508]: cali7a0e6d993ea: Gained carrier Nov 24 06:49:51.078929 containerd[1631]: 2025-11-24 06:49:50.777 [INFO][4101] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Nov 24 06:49:51.078929 containerd[1631]: 2025-11-24 06:49:50.809 [INFO][4101] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--bc589b8c4--9bnlt-eth0 whisker-bc589b8c4- calico-system 0c43478b-1cfb-4a98-8686-d4d93291e6b2 926 0 2025-11-24 06:49:50 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:bc589b8c4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-bc589b8c4-9bnlt eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7a0e6d993ea [] [] }} 
ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-" Nov 24 06:49:51.078929 containerd[1631]: 2025-11-24 06:49:50.809 [INFO][4101] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.078929 containerd[1631]: 2025-11-24 06:49:51.023 [INFO][4113] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" HandleID="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Workload="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.026 [INFO][4113] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" HandleID="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Workload="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003f4d60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-bc589b8c4-9bnlt", "timestamp":"2025-11-24 06:49:51.023716741 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.026 [INFO][4113] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.026 [INFO][4113] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.027 [INFO][4113] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.036 [INFO][4113] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" host="localhost" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.045 [INFO][4113] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.048 [INFO][4113] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.049 [INFO][4113] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.050 [INFO][4113] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:51.079813 containerd[1631]: 2025-11-24 06:49:51.050 [INFO][4113] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" host="localhost" Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.050 [INFO][4113] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5 Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.052 [INFO][4113] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" host="localhost" Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.055 [INFO][4113] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" host="localhost" Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.055 [INFO][4113] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" host="localhost" Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.055 [INFO][4113] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:51.080580 containerd[1631]: 2025-11-24 06:49:51.055 [INFO][4113] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" HandleID="k8s-pod-network.4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Workload="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.081206 containerd[1631]: 2025-11-24 06:49:51.056 [INFO][4101] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--bc589b8c4--9bnlt-eth0", GenerateName:"whisker-bc589b8c4-", Namespace:"calico-system", SelfLink:"", UID:"0c43478b-1cfb-4a98-8686-d4d93291e6b2", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bc589b8c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-bc589b8c4-9bnlt", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7a0e6d993ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:51.081206 containerd[1631]: 2025-11-24 06:49:51.056 [INFO][4101] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.081414 containerd[1631]: 2025-11-24 06:49:51.057 [INFO][4101] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a0e6d993ea ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.081414 containerd[1631]: 2025-11-24 06:49:51.070 [INFO][4101] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.081449 containerd[1631]: 2025-11-24 06:49:51.070 [INFO][4101] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--bc589b8c4--9bnlt-eth0", GenerateName:"whisker-bc589b8c4-", Namespace:"calico-system", SelfLink:"", UID:"0c43478b-1cfb-4a98-8686-d4d93291e6b2", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bc589b8c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5", Pod:"whisker-bc589b8c4-9bnlt", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7a0e6d993ea", MAC:"c6:75:49:ed:7e:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:51.081500 containerd[1631]: 2025-11-24 06:49:51.076 [INFO][4101] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" Namespace="calico-system" Pod="whisker-bc589b8c4-9bnlt" WorkloadEndpoint="localhost-k8s-whisker--bc589b8c4--9bnlt-eth0" Nov 24 06:49:51.104138 containerd[1631]: time="2025-11-24T06:49:51.104109206Z" level=info msg="connecting to shim 4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5" address="unix:///run/containerd/s/f072bebc10bce0ef20a519c579496080c069a0e7ca8883ebe02acf17eced386b" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:51.120326 systemd[1]: Started cri-containerd-4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5.scope - libcontainer container 4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5. 
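The IPAM trace above shows Calico claiming the first free address from the host-affine block 192.168.88.128/26 under the host-wide IPAM lock, then recording it in the WorkloadEndpoint (veth cali7a0e6d993ea, IP 192.168.88.129/32). A toy sketch of the "first free address in a block" step; the in-memory used map and the pre-reserved .128 entry are illustrative assumptions only, not how Calico actually persists or reserves block state.

```go
// Sketch: pick the lowest unused address in a small IPv4 block.
package main

import (
	"fmt"
	"net"
)

// allocate returns the first address in the block that is not yet in use.
// Toy version: assumes the block does not cross an octet boundary.
func allocate(block *net.IPNet, used map[string]bool) (net.IP, bool) {
	ones, bits := block.Mask.Size()
	size := 1 << (bits - ones) // 64 addresses in a /26
	base := block.IP.To4()
	for i := 0; i < size; i++ {
		ip := net.IPv4(base[0], base[1], base[2], base[3]+byte(i))
		if !used[ip.String()] {
			used[ip.String()] = true
			return ip, true
		}
	}
	return nil, false
}

func main() {
	_, block, _ := net.ParseCIDR("192.168.88.128/26")
	used := map[string]bool{"192.168.88.128": true} // assume the block's first address is reserved
	ip, ok := allocate(block, used)
	fmt.Println(ip, ok) // 192.168.88.129 true
}
```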
Nov 24 06:49:51.131102 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:51.157953 containerd[1631]: time="2025-11-24T06:49:51.157846288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bc589b8c4-9bnlt,Uid:0c43478b-1cfb-4a98-8686-d4d93291e6b2,Namespace:calico-system,Attempt:0,} returns sandbox id \"4f65cbe32577e5597b14e7aa83f7e56115e349971aba37b2af0eeade400c43a5\"" Nov 24 06:49:51.162936 containerd[1631]: time="2025-11-24T06:49:51.162921151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:49:51.502509 containerd[1631]: time="2025-11-24T06:49:51.502470607Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:51.504499 containerd[1631]: time="2025-11-24T06:49:51.504479523Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:49:51.504533 containerd[1631]: time="2025-11-24T06:49:51.504526793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:49:51.526810 kubelet[2928]: E1124 06:49:51.514606 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:49:51.536049 kubelet[2928]: E1124 06:49:51.532453 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:49:51.545713 kubelet[2928]: E1124 06:49:51.545663 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1dfd73395a3b493db78d35bb9e9b6696,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:51.550745 containerd[1631]: time="2025-11-24T06:49:51.550724504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:49:51.614792 systemd-networkd[1508]: vxlan.calico: Link UP Nov 24 06:49:51.615256 systemd-networkd[1508]: vxlan.calico: Gained carrier Nov 24 06:49:51.891866 containerd[1631]: time="2025-11-24T06:49:51.891689587Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:51.892310 containerd[1631]: time="2025-11-24T06:49:51.892283386Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:49:51.892539 containerd[1631]: time="2025-11-24T06:49:51.892346951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:49:51.892571 kubelet[2928]: E1124 06:49:51.892477 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:49:51.892571 kubelet[2928]: E1124 06:49:51.892515 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:49:51.892677 kubelet[2928]: E1124 06:49:51.892595 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:51.906016 kubelet[2928]: E1124 06:49:51.905969 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:49:52.092424 kubelet[2928]: I1124 06:49:52.092319 2928 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bc90fb97-b904-4a7f-ac89-30f34f24cf82" path="/var/lib/kubelet/pods/bc90fb97-b904-4a7f-ac89-30f34f24cf82/volumes" Nov 24 06:49:52.424781 kubelet[2928]: E1124 06:49:52.424737 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:49:52.524328 systemd-networkd[1508]: cali7a0e6d993ea: Gained IPv6LL Nov 24 06:49:53.100358 systemd-networkd[1508]: vxlan.calico: Gained IPv6LL Nov 24 06:49:54.096802 containerd[1631]: time="2025-11-24T06:49:54.096775129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jqtdz,Uid:18595df3-20a1-4082-ac7b-f679e33292aa,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:54.167724 systemd-networkd[1508]: cali1dc896e9bdd: Link UP Nov 24 06:49:54.167821 systemd-networkd[1508]: cali1dc896e9bdd: Gained carrier Nov 24 06:49:54.180673 containerd[1631]: 2025-11-24 06:49:54.124 [INFO][4376] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0 coredns-668d6bf9bc- kube-system 18595df3-20a1-4082-ac7b-f679e33292aa 834 0 2025-11-24 06:49:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-jqtdz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1dc896e9bdd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-" Nov 24 06:49:54.180673 containerd[1631]: 2025-11-24 06:49:54.124 [INFO][4376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.180673 containerd[1631]: 2025-11-24 06:49:54.145 [INFO][4388] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" HandleID="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Workload="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.145 [INFO][4388] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" 
HandleID="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Workload="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-jqtdz", "timestamp":"2025-11-24 06:49:54.145170762 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.145 [INFO][4388] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.145 [INFO][4388] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.145 [INFO][4388] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.150 [INFO][4388] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" host="localhost" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.152 [INFO][4388] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.154 [INFO][4388] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.155 [INFO][4388] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.156 [INFO][4388] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:54.180849 containerd[1631]: 2025-11-24 06:49:54.156 [INFO][4388] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" host="localhost" Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.157 [INFO][4388] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.159 [INFO][4388] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" host="localhost" Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.163 [INFO][4388] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" host="localhost" Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.163 [INFO][4388] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" host="localhost" Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.163 [INFO][4388] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:54.181179 containerd[1631]: 2025-11-24 06:49:54.163 [INFO][4388] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" HandleID="k8s-pod-network.81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Workload="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.181420 containerd[1631]: 2025-11-24 06:49:54.165 [INFO][4376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"18595df3-20a1-4082-ac7b-f679e33292aa", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-jqtdz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dc896e9bdd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:54.181466 containerd[1631]: 2025-11-24 06:49:54.165 [INFO][4376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.181466 containerd[1631]: 2025-11-24 06:49:54.165 [INFO][4376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1dc896e9bdd ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.181466 containerd[1631]: 2025-11-24 06:49:54.167 [INFO][4376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.181516 
containerd[1631]: 2025-11-24 06:49:54.168 [INFO][4376] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"18595df3-20a1-4082-ac7b-f679e33292aa", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee", Pod:"coredns-668d6bf9bc-jqtdz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dc896e9bdd", MAC:"56:02:5f:9d:02:30", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:54.181516 containerd[1631]: 2025-11-24 06:49:54.174 [INFO][4376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" Namespace="kube-system" Pod="coredns-668d6bf9bc-jqtdz" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jqtdz-eth0" Nov 24 06:49:54.199332 containerd[1631]: time="2025-11-24T06:49:54.199297203Z" level=info msg="connecting to shim 81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee" address="unix:///run/containerd/s/8aba8b6485d482742743b45ec789eaaaa4ce08705f232b9835d3a8dbbc08ed75" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:54.218333 systemd[1]: Started cri-containerd-81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee.scope - libcontainer container 81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee. 
Nov 24 06:49:54.227248 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:54.255087 containerd[1631]: time="2025-11-24T06:49:54.255057868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jqtdz,Uid:18595df3-20a1-4082-ac7b-f679e33292aa,Namespace:kube-system,Attempt:0,} returns sandbox id \"81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee\"" Nov 24 06:49:54.258658 containerd[1631]: time="2025-11-24T06:49:54.258642909Z" level=info msg="CreateContainer within sandbox \"81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Nov 24 06:49:54.273059 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2466634321.mount: Deactivated successfully. Nov 24 06:49:54.274278 containerd[1631]: time="2025-11-24T06:49:54.273855709Z" level=info msg="Container 0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:54.276988 containerd[1631]: time="2025-11-24T06:49:54.276966931Z" level=info msg="CreateContainer within sandbox \"81ce9f6d41a85ba6d5aa4e8c107d57d00e629858893a3569eb642783c8f91fee\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465\"" Nov 24 06:49:54.277969 containerd[1631]: time="2025-11-24T06:49:54.277397035Z" level=info msg="StartContainer for \"0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465\"" Nov 24 06:49:54.277969 containerd[1631]: time="2025-11-24T06:49:54.277831202Z" level=info msg="connecting to shim 0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465" address="unix:///run/containerd/s/8aba8b6485d482742743b45ec789eaaaa4ce08705f232b9835d3a8dbbc08ed75" protocol=ttrpc version=3 Nov 24 06:49:54.299345 systemd[1]: Started cri-containerd-0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465.scope - libcontainer container 0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465. 
Nov 24 06:49:54.320209 containerd[1631]: time="2025-11-24T06:49:54.320183863Z" level=info msg="StartContainer for \"0ad1515a5068cdaddb9fc1b6f969425eae6233a3aa9013f448aadd03a2822465\" returns successfully" Nov 24 06:49:55.091429 containerd[1631]: time="2025-11-24T06:49:55.091394727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-n9f2t,Uid:0c6abf64-6464-41f7-b11b-979ba6b72128,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:55.166717 systemd-networkd[1508]: cali5eb83fa982d: Link UP Nov 24 06:49:55.167510 systemd-networkd[1508]: cali5eb83fa982d: Gained carrier Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.118 [INFO][4485] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--n9f2t-eth0 csi-node-driver- calico-system 0c6abf64-6464-41f7-b11b-979ba6b72128 733 0 2025-11-24 06:49:32 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-n9f2t eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5eb83fa982d [] [] }} ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.118 [INFO][4485] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.142 [INFO][4496] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" HandleID="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Workload="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.142 [INFO][4496] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" HandleID="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Workload="localhost-k8s-csi--node--driver--n9f2t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c56d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-n9f2t", "timestamp":"2025-11-24 06:49:55.142295092 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.142 [INFO][4496] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.142 [INFO][4496] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.142 [INFO][4496] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.147 [INFO][4496] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.150 [INFO][4496] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.152 [INFO][4496] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.153 [INFO][4496] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.155 [INFO][4496] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.155 [INFO][4496] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.155 [INFO][4496] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.157 [INFO][4496] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.160 [INFO][4496] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.160 [INFO][4496] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" host="localhost" Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.160 [INFO][4496] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:55.181218 containerd[1631]: 2025-11-24 06:49:55.160 [INFO][4496] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" HandleID="k8s-pod-network.fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Workload="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.164 [INFO][4485] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--n9f2t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0c6abf64-6464-41f7-b11b-979ba6b72128", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-n9f2t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5eb83fa982d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.164 [INFO][4485] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.164 [INFO][4485] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5eb83fa982d ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.167 [INFO][4485] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.168 [INFO][4485] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--n9f2t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0c6abf64-6464-41f7-b11b-979ba6b72128", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b", Pod:"csi-node-driver-n9f2t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5eb83fa982d", MAC:"fa:b4:c6:2c:c5:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:55.183503 containerd[1631]: 2025-11-24 06:49:55.174 [INFO][4485] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" Namespace="calico-system" Pod="csi-node-driver-n9f2t" WorkloadEndpoint="localhost-k8s-csi--node--driver--n9f2t-eth0" Nov 24 06:49:55.224730 containerd[1631]: time="2025-11-24T06:49:55.224691893Z" level=info msg="connecting to shim fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b" address="unix:///run/containerd/s/d8d3b3a072566bad5afb481b663302e3237347e0c2d6160b4789c44f70797edf" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:55.250491 systemd[1]: Started cri-containerd-fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b.scope - libcontainer container fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b. 
Nov 24 06:49:55.270056 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:55.271962 kubelet[2928]: I1124 06:49:55.246922 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-jqtdz" podStartSLOduration=36.180894824 podStartE2EDuration="36.180894824s" podCreationTimestamp="2025-11-24 06:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:54.435731929 +0000 UTC m=+40.507185549" watchObservedRunningTime="2025-11-24 06:49:55.180894824 +0000 UTC m=+41.252348444" Nov 24 06:49:55.287679 containerd[1631]: time="2025-11-24T06:49:55.287652926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-n9f2t,Uid:0c6abf64-6464-41f7-b11b-979ba6b72128,Namespace:calico-system,Attempt:0,} returns sandbox id \"fd6c3a1e43c7c300c6bf656994e0b6e51cca3875b6d4ee13dccfb213bc7a391b\"" Nov 24 06:49:55.296577 containerd[1631]: time="2025-11-24T06:49:55.296553185Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:49:55.614050 containerd[1631]: time="2025-11-24T06:49:55.613963735Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:55.614280 containerd[1631]: time="2025-11-24T06:49:55.614262284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:49:55.614661 containerd[1631]: time="2025-11-24T06:49:55.614305451Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:49:55.614689 kubelet[2928]: E1124 06:49:55.614388 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:49:55.614689 kubelet[2928]: E1124 06:49:55.614429 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:49:55.614689 kubelet[2928]: E1124 06:49:55.614509 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:55.616547 containerd[1631]: time="2025-11-24T06:49:55.616529220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:49:55.852340 systemd-networkd[1508]: cali1dc896e9bdd: Gained IPv6LL Nov 24 06:49:55.992130 containerd[1631]: time="2025-11-24T06:49:55.992099804Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:55.992416 containerd[1631]: time="2025-11-24T06:49:55.992396130Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:49:55.992604 kubelet[2928]: E1124 06:49:55.992572 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:49:55.992655 kubelet[2928]: E1124 06:49:55.992604 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:49:55.993139 kubelet[2928]: E1124 06:49:55.992673 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:55.993953 kubelet[2928]: E1124 06:49:55.993929 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 
06:49:56.002246 containerd[1631]: time="2025-11-24T06:49:55.992440467Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:49:56.091841 containerd[1631]: time="2025-11-24T06:49:56.091811698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mxk2w,Uid:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:56.092150 containerd[1631]: time="2025-11-24T06:49:56.092137907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-7hjcp,Uid:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:56.172589 systemd-networkd[1508]: cali8248adb4e0f: Link UP Nov 24 06:49:56.173110 systemd-networkd[1508]: cali8248adb4e0f: Gained carrier Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.126 [INFO][4558] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--666569f655--mxk2w-eth0 goldmane-666569f655- calico-system 57b14474-5edf-4409-a6bd-e5a9f7dc6f4e 841 0 2025-11-24 06:49:29 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-666569f655-mxk2w eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali8248adb4e0f [] [] }} ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.126 [INFO][4558] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.146 [INFO][4584] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" HandleID="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Workload="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.146 [INFO][4584] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" HandleID="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Workload="localhost-k8s-goldmane--666569f655--mxk2w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-666569f655-mxk2w", "timestamp":"2025-11-24 06:49:56.146058257 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.146 [INFO][4584] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.146 [INFO][4584] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.146 [INFO][4584] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.152 [INFO][4584] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.156 [INFO][4584] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.160 [INFO][4584] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.161 [INFO][4584] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.162 [INFO][4584] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.162 [INFO][4584] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.163 [INFO][4584] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.165 [INFO][4584] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.167 [INFO][4584] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.167 [INFO][4584] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" host="localhost" Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.167 [INFO][4584] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:56.184921 containerd[1631]: 2025-11-24 06:49:56.167 [INFO][4584] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" HandleID="k8s-pod-network.244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Workload="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.170 [INFO][4558] cni-plugin/k8s.go 418: Populated endpoint ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--mxk2w-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-666569f655-mxk2w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8248adb4e0f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.170 [INFO][4558] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.170 [INFO][4558] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8248adb4e0f ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.172 [INFO][4558] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.172 [INFO][4558] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--mxk2w-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"57b14474-5edf-4409-a6bd-e5a9f7dc6f4e", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de", Pod:"goldmane-666569f655-mxk2w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8248adb4e0f", MAC:"c6:ff:75:70:ea:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:56.186215 containerd[1631]: 2025-11-24 06:49:56.182 [INFO][4558] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" Namespace="calico-system" Pod="goldmane-666569f655-mxk2w" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--mxk2w-eth0" Nov 24 06:49:56.200654 containerd[1631]: time="2025-11-24T06:49:56.200627089Z" level=info msg="connecting to shim 244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de" address="unix:///run/containerd/s/c877f41118e15afef36b6d8a3868b7f151f72dfae4e0dd719453b26398e40936" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:56.220338 systemd[1]: Started cri-containerd-244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de.scope - libcontainer container 244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de. 
Nov 24 06:49:56.229070 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:56.256384 containerd[1631]: time="2025-11-24T06:49:56.256218212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-mxk2w,Uid:57b14474-5edf-4409-a6bd-e5a9f7dc6f4e,Namespace:calico-system,Attempt:0,} returns sandbox id \"244a8ff659dab55fc5ae1996328602532699abe0cbe01fbe49719c16d80539de\"" Nov 24 06:49:56.258574 containerd[1631]: time="2025-11-24T06:49:56.258187094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:49:56.274582 systemd-networkd[1508]: caliea2c4b55f1d: Link UP Nov 24 06:49:56.275052 systemd-networkd[1508]: caliea2c4b55f1d: Gained carrier Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.126 [INFO][4567] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0 calico-apiserver-7887855f8c- calico-apiserver f9c4aeec-8d56-49ab-910e-5dc9d27b3e29 840 0 2025-11-24 06:49:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7887855f8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7887855f8c-7hjcp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliea2c4b55f1d [] [] }} ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.126 [INFO][4567] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.155 [INFO][4586] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" HandleID="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Workload="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.156 [INFO][4586] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" HandleID="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Workload="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00025af90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7887855f8c-7hjcp", "timestamp":"2025-11-24 06:49:56.155985107 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.156 [INFO][4586] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.167 [INFO][4586] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.168 [INFO][4586] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.254 [INFO][4586] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.258 [INFO][4586] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.261 [INFO][4586] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.262 [INFO][4586] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.263 [INFO][4586] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.263 [INFO][4586] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.264 [INFO][4586] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75 Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.267 [INFO][4586] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.269 [INFO][4586] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.269 [INFO][4586] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" host="localhost" Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.269 [INFO][4586] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:56.289420 containerd[1631]: 2025-11-24 06:49:56.269 [INFO][4586] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" HandleID="k8s-pod-network.7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Workload="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.271 [INFO][4567] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0", GenerateName:"calico-apiserver-7887855f8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f9c4aeec-8d56-49ab-910e-5dc9d27b3e29", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7887855f8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7887855f8c-7hjcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliea2c4b55f1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.272 [INFO][4567] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.272 [INFO][4567] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliea2c4b55f1d ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.275 [INFO][4567] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.275 [INFO][4567] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0", GenerateName:"calico-apiserver-7887855f8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f9c4aeec-8d56-49ab-910e-5dc9d27b3e29", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7887855f8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75", Pod:"calico-apiserver-7887855f8c-7hjcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliea2c4b55f1d", MAC:"1e:cd:0f:cb:28:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:56.290480 containerd[1631]: 2025-11-24 06:49:56.286 [INFO][4567] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-7hjcp" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--7hjcp-eth0" Nov 24 06:49:56.303596 containerd[1631]: time="2025-11-24T06:49:56.303565424Z" level=info msg="connecting to shim 7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75" address="unix:///run/containerd/s/47b2ca2c463241672c9b65d210d4928a130bebb02c26fbbcf52cb6e27da86667" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:56.326324 systemd[1]: Started cri-containerd-7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75.scope - libcontainer container 7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75. 
Nov 24 06:49:56.334494 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:56.358561 containerd[1631]: time="2025-11-24T06:49:56.358492331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-7hjcp,Uid:f9c4aeec-8d56-49ab-910e-5dc9d27b3e29,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7ad59a646c83e2c7f2813c3790f2e9d6d2c4c6e593bda3a0a1c26255eebfae75\"" Nov 24 06:49:56.442754 kubelet[2928]: E1124 06:49:56.442721 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:49:56.597145 containerd[1631]: time="2025-11-24T06:49:56.597064047Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:56.597554 containerd[1631]: time="2025-11-24T06:49:56.597488904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:49:56.597554 containerd[1631]: time="2025-11-24T06:49:56.597537858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:49:56.597656 kubelet[2928]: E1124 06:49:56.597632 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:49:56.597699 kubelet[2928]: E1124 06:49:56.597662 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:49:56.598103 kubelet[2928]: E1124 06:49:56.597835 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd8l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:56.598374 containerd[1631]: time="2025-11-24T06:49:56.598308487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:49:56.599404 kubelet[2928]: E1124 06:49:56.599374 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:49:56.876610 systemd-networkd[1508]: cali5eb83fa982d: Gained IPv6LL Nov 24 06:49:56.964908 containerd[1631]: time="2025-11-24T06:49:56.964877325Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:56.965328 containerd[1631]: time="2025-11-24T06:49:56.965307175Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:49:56.965372 containerd[1631]: time="2025-11-24T06:49:56.965354769Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:49:56.965484 kubelet[2928]: E1124 06:49:56.965454 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:56.965524 kubelet[2928]: E1124 06:49:56.965486 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:56.965592 kubelet[2928]: E1124 06:49:56.965564 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlhhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:56.966744 kubelet[2928]: E1124 06:49:56.966726 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:49:57.091732 containerd[1631]: time="2025-11-24T06:49:57.091703199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cdc8946dd-7dnrd,Uid:a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68,Namespace:calico-system,Attempt:0,}" Nov 24 06:49:57.155822 systemd-networkd[1508]: cali136890f194b: Link UP Nov 24 06:49:57.157113 systemd-networkd[1508]: cali136890f194b: Gained carrier Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.122 [INFO][4715] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0 calico-kube-controllers-7cdc8946dd- calico-system a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68 837 0 2025-11-24 06:49:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cdc8946dd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7cdc8946dd-7dnrd eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali136890f194b [] [] }} ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.122 [INFO][4715] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.135 [INFO][4727] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" HandleID="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Workload="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.135 [INFO][4727] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" HandleID="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Workload="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f8e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7cdc8946dd-7dnrd", "timestamp":"2025-11-24 06:49:57.135760487 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.135 [INFO][4727] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.135 [INFO][4727] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.136 [INFO][4727] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.139 [INFO][4727] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.142 [INFO][4727] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.143 [INFO][4727] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.144 [INFO][4727] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.145 [INFO][4727] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.145 [INFO][4727] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.146 [INFO][4727] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91 Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.148 [INFO][4727] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.151 [INFO][4727] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.151 [INFO][4727] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" host="localhost" Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.151 [INFO][4727] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:49:57.168266 containerd[1631]: 2025-11-24 06:49:57.151 [INFO][4727] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" HandleID="k8s-pod-network.e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Workload="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.154 [INFO][4715] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0", GenerateName:"calico-kube-controllers-7cdc8946dd-", Namespace:"calico-system", SelfLink:"", UID:"a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cdc8946dd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7cdc8946dd-7dnrd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali136890f194b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.154 [INFO][4715] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.154 [INFO][4715] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali136890f194b ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.157 [INFO][4715] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.157 [INFO][4715] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0", GenerateName:"calico-kube-controllers-7cdc8946dd-", Namespace:"calico-system", SelfLink:"", UID:"a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cdc8946dd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91", Pod:"calico-kube-controllers-7cdc8946dd-7dnrd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali136890f194b", MAC:"76:37:ee:cd:7b:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:57.169467 containerd[1631]: 2025-11-24 06:49:57.165 [INFO][4715] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" Namespace="calico-system" Pod="calico-kube-controllers-7cdc8946dd-7dnrd" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cdc8946dd--7dnrd-eth0" Nov 24 06:49:57.179138 containerd[1631]: time="2025-11-24T06:49:57.179108667Z" level=info msg="connecting to shim e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91" address="unix:///run/containerd/s/afcfb92578d4098f594a1a68bf34e51179518f17550f1b6dfd49cbbba86548e0" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:57.196325 systemd[1]: Started cri-containerd-e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91.scope - libcontainer container e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91. 
Nov 24 06:49:57.207904 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:57.234311 containerd[1631]: time="2025-11-24T06:49:57.234277812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cdc8946dd-7dnrd,Uid:a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68,Namespace:calico-system,Attempt:0,} returns sandbox id \"e8d8c9340a41858c5af1d784dac69da88fd4813eca07cbf0da0e05ffd47d5d91\"" Nov 24 06:49:57.235727 containerd[1631]: time="2025-11-24T06:49:57.235632443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:49:57.448285 kubelet[2928]: E1124 06:49:57.448148 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:49:57.448285 kubelet[2928]: E1124 06:49:57.448203 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:49:57.558325 containerd[1631]: time="2025-11-24T06:49:57.558213883Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:57.558744 containerd[1631]: time="2025-11-24T06:49:57.558727399Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:49:57.558833 containerd[1631]: time="2025-11-24T06:49:57.558768246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:49:57.558971 kubelet[2928]: E1124 06:49:57.558945 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:49:57.559676 kubelet[2928]: E1124 06:49:57.558976 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 
24 06:49:57.559676 kubelet[2928]: E1124 06:49:57.559301 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqgvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cdc8946dd-7dnrd_calico-system(a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:57.560437 kubelet[2928]: E1124 06:49:57.560415 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" 
podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:49:57.644336 systemd-networkd[1508]: caliea2c4b55f1d: Gained IPv6LL Nov 24 06:49:58.092041 containerd[1631]: time="2025-11-24T06:49:58.091841158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jmbwk,Uid:8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9,Namespace:kube-system,Attempt:0,}" Nov 24 06:49:58.092580 systemd-networkd[1508]: cali8248adb4e0f: Gained IPv6LL Nov 24 06:49:58.092916 containerd[1631]: time="2025-11-24T06:49:58.092900096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66994bd4cb-j58dd,Uid:54ddf3ce-7798-43d6-964a-ec131b6bd310,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:58.179549 systemd-networkd[1508]: cali31b21cf93b9: Link UP Nov 24 06:49:58.179657 systemd-networkd[1508]: cali31b21cf93b9: Gained carrier Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.125 [INFO][4791] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0 calico-apiserver-66994bd4cb- calico-apiserver 54ddf3ce-7798-43d6-964a-ec131b6bd310 839 0 2025-11-24 06:49:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:66994bd4cb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-66994bd4cb-j58dd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali31b21cf93b9 [] [] }} ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.125 [INFO][4791] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.150 [INFO][4815] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" HandleID="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Workload="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.151 [INFO][4815] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" HandleID="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Workload="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5110), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-66994bd4cb-j58dd", "timestamp":"2025-11-24 06:49:58.150552539 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.151 [INFO][4815] ipam/ipam_plugin.go 377: About to acquire 
host-wide IPAM lock. Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.151 [INFO][4815] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.151 [INFO][4815] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.160 [INFO][4815] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.163 [INFO][4815] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.164 [INFO][4815] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.165 [INFO][4815] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.166 [INFO][4815] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.166 [INFO][4815] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.167 [INFO][4815] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2 Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.169 [INFO][4815] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.172 [INFO][4815] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.172 [INFO][4815] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" host="localhost" Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.172 [INFO][4815] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
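The ipam_plugin.go entries above trace the same claim sequence for every new pod: acquire the host-wide IPAM lock, look up the host's block affinity (192.168.88.128/26), pick a free address, write the block back to claim it, and release the lock. The Go sketch below is a deliberately simplified model of that allocate-under-lock pattern, not Calico's actual ipam package, which persists its blocks in the datastore.

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator is a toy stand-in for the per-host IPAM block the log
// traces; it only mirrors the lock/claim/release ordering of the entries.
type blockAllocator struct {
	mu    sync.Mutex
	block netip.Prefix
	used  map[netip.Addr]bool
}

func (b *blockAllocator) assign() (netip.Addr, bool) {
	b.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer b.mu.Unlock() // "Released host-wide IPAM lock."

	for a := b.block.Addr(); b.block.Contains(a); a = a.Next() {
		if !b.used[a] {
			b.used[a] = true // "Writing block in order to claim IPs"
			return a, true
		}
	}
	return netip.Addr{}, false // block exhausted
}

func main() {
	alloc := &blockAllocator{
		block: netip.MustParsePrefix("192.168.88.128/26"),
		used:  map[netip.Addr]bool{},
	}
	if a, ok := alloc.assign(); ok {
		fmt.Println("claimed", a)
	}
}
```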
Nov 24 06:49:58.189955 containerd[1631]: 2025-11-24 06:49:58.172 [INFO][4815] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" HandleID="k8s-pod-network.f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Workload="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.174 [INFO][4791] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0", GenerateName:"calico-apiserver-66994bd4cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"54ddf3ce-7798-43d6-964a-ec131b6bd310", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66994bd4cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-66994bd4cb-j58dd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31b21cf93b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.174 [INFO][4791] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.174 [INFO][4791] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31b21cf93b9 ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.179 [INFO][4791] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.179 [INFO][4791] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0", GenerateName:"calico-apiserver-66994bd4cb-", Namespace:"calico-apiserver", SelfLink:"", UID:"54ddf3ce-7798-43d6-964a-ec131b6bd310", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66994bd4cb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2", Pod:"calico-apiserver-66994bd4cb-j58dd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31b21cf93b9", MAC:"42:99:bd:64:95:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:58.190747 containerd[1631]: 2025-11-24 06:49:58.187 [INFO][4791] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" Namespace="calico-apiserver" Pod="calico-apiserver-66994bd4cb-j58dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--66994bd4cb--j58dd-eth0" Nov 24 06:49:58.305989 containerd[1631]: time="2025-11-24T06:49:58.305469968Z" level=info msg="connecting to shim f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2" address="unix:///run/containerd/s/69d2fa15cf2b68a154c4f67f6595c81f96af18920b44e35928ac1f1171c2acb3" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:58.323528 systemd-networkd[1508]: cali5a6d345dc66: Link UP Nov 24 06:49:58.323928 systemd-networkd[1508]: cali5a6d345dc66: Gained carrier Nov 24 06:49:58.328276 systemd[1]: Started cri-containerd-f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2.scope - libcontainer container f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2. 
Nov 24 06:49:58.337869 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.131 [INFO][4790] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0 coredns-668d6bf9bc- kube-system 8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9 842 0 2025-11-24 06:49:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-jmbwk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5a6d345dc66 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.131 [INFO][4790] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.156 [INFO][4820] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" HandleID="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Workload="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.157 [INFO][4820] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" HandleID="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Workload="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00048efa0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-jmbwk", "timestamp":"2025-11-24 06:49:58.156904958 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.157 [INFO][4820] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.173 [INFO][4820] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.173 [INFO][4820] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.261 [INFO][4820] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.264 [INFO][4820] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.284 [INFO][4820] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.286 [INFO][4820] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.289 [INFO][4820] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.289 [INFO][4820] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.294 [INFO][4820] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.305 [INFO][4820] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.315 [INFO][4820] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.316 [INFO][4820] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" host="localhost" Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.316 [INFO][4820] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:49:58.342345 containerd[1631]: 2025-11-24 06:49:58.316 [INFO][4820] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" HandleID="k8s-pod-network.6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Workload="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.343747 containerd[1631]: 2025-11-24 06:49:58.318 [INFO][4790] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-jmbwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5a6d345dc66", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:58.343747 containerd[1631]: 2025-11-24 06:49:58.318 [INFO][4790] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.343747 containerd[1631]: 2025-11-24 06:49:58.318 [INFO][4790] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a6d345dc66 ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.343747 containerd[1631]: 2025-11-24 06:49:58.324 [INFO][4790] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.343747 
containerd[1631]: 2025-11-24 06:49:58.324 [INFO][4790] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df", Pod:"coredns-668d6bf9bc-jmbwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5a6d345dc66", MAC:"ce:f3:70:01:6a:20", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:58.343747 containerd[1631]: 2025-11-24 06:49:58.339 [INFO][4790] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" Namespace="kube-system" Pod="coredns-668d6bf9bc-jmbwk" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--jmbwk-eth0" Nov 24 06:49:58.374766 containerd[1631]: time="2025-11-24T06:49:58.374746412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66994bd4cb-j58dd,Uid:54ddf3ce-7798-43d6-964a-ec131b6bd310,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f323f7f1a9ffd12622ad529418c56a14e4d867b3a28cba09f710bb799bf80eb2\"" Nov 24 06:49:58.375985 containerd[1631]: time="2025-11-24T06:49:58.375885910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:49:58.418586 containerd[1631]: time="2025-11-24T06:49:58.418555938Z" level=info msg="connecting to shim 6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df" address="unix:///run/containerd/s/8fea1329b7235af56bd6d58a2b3ab6a9ca69bc94688c1c1a6b46e6edf1352b10" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:58.438459 systemd[1]: Started cri-containerd-6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df.scope - libcontainer container 
6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df. Nov 24 06:49:58.450910 kubelet[2928]: E1124 06:49:58.450771 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:49:58.452346 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:58.483839 containerd[1631]: time="2025-11-24T06:49:58.483817530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jmbwk,Uid:8d5ea9ae-70a1-4d7a-a9c6-6b6b766f4ac9,Namespace:kube-system,Attempt:0,} returns sandbox id \"6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df\"" Nov 24 06:49:58.498964 containerd[1631]: time="2025-11-24T06:49:58.498831894Z" level=info msg="CreateContainer within sandbox \"6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Nov 24 06:49:58.555975 containerd[1631]: time="2025-11-24T06:49:58.555934825Z" level=info msg="Container 7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:49:58.590839 containerd[1631]: time="2025-11-24T06:49:58.590755823Z" level=info msg="CreateContainer within sandbox \"6d0ff6f18aadc1bdff1678d5ae62ae8d0d1604f31212e8bda551450da56df2df\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f\"" Nov 24 06:49:58.591834 containerd[1631]: time="2025-11-24T06:49:58.591814681Z" level=info msg="StartContainer for \"7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f\"" Nov 24 06:49:58.592606 containerd[1631]: time="2025-11-24T06:49:58.592548897Z" level=info msg="connecting to shim 7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f" address="unix:///run/containerd/s/8fea1329b7235af56bd6d58a2b3ab6a9ca69bc94688c1c1a6b46e6edf1352b10" protocol=ttrpc version=3 Nov 24 06:49:58.610322 systemd[1]: Started cri-containerd-7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f.scope - libcontainer container 7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f. 
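The coredns WorkloadEndpoint dumped above lists its ports in hexadecimal (Port:0x35 for dns and dns-tcp, Port:0x23c1 for metrics). Converted, these are the expected CoreDNS ports, as the trivial check below shows.

```go
package main

import "fmt"

func main() {
	// Hex port values from the coredns WorkloadEndpoint in the log.
	fmt.Println("dns:", 0x35, "metrics:", 0x23c1) // dns: 53 metrics: 9153
}
```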
Nov 24 06:49:58.638696 containerd[1631]: time="2025-11-24T06:49:58.638676944Z" level=info msg="StartContainer for \"7344e6f5906601317c11a16aea84821b6a4395edc16052e47fd27554a5ff6e9f\" returns successfully" Nov 24 06:49:58.702787 containerd[1631]: time="2025-11-24T06:49:58.702344356Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:58.707368 containerd[1631]: time="2025-11-24T06:49:58.707347679Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:49:58.707417 containerd[1631]: time="2025-11-24T06:49:58.707394692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:49:58.707499 kubelet[2928]: E1124 06:49:58.707470 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:58.707534 kubelet[2928]: E1124 06:49:58.707504 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:58.708240 kubelet[2928]: E1124 06:49:58.707575 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jrzb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:58.708824 kubelet[2928]: E1124 06:49:58.708791 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:49:59.091497 containerd[1631]: time="2025-11-24T06:49:59.091469581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-x6nck,Uid:3ac853da-b498-4eb2-aacf-2ea6168a1205,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:49:59.174364 systemd-networkd[1508]: calib7b8f1214be: Link UP Nov 24 06:49:59.175010 systemd-networkd[1508]: calib7b8f1214be: Gained carrier Nov 24 06:49:59.180357 systemd-networkd[1508]: cali136890f194b: Gained IPv6LL Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.120 [INFO][4973] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0 calico-apiserver-7887855f8c- calico-apiserver 3ac853da-b498-4eb2-aacf-2ea6168a1205 838 0 2025-11-24 06:49:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7887855f8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7887855f8c-x6nck eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib7b8f1214be [] [] }} ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.120 [INFO][4973] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.148 [INFO][4984] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" HandleID="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Workload="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.149 [INFO][4984] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" HandleID="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Workload="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f120), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7887855f8c-x6nck", "timestamp":"2025-11-24 06:49:59.14892021 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.149 [INFO][4984] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.149 [INFO][4984] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.149 [INFO][4984] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.154 [INFO][4984] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.157 [INFO][4984] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.159 [INFO][4984] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.160 [INFO][4984] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.161 [INFO][4984] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.161 [INFO][4984] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.162 [INFO][4984] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6 Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.164 [INFO][4984] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.168 [INFO][4984] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.168 [INFO][4984] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" host="localhost" Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.168 [INFO][4984] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:49:59.189154 containerd[1631]: 2025-11-24 06:49:59.168 [INFO][4984] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" HandleID="k8s-pod-network.0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Workload="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 06:49:59.171 [INFO][4973] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0", GenerateName:"calico-apiserver-7887855f8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ac853da-b498-4eb2-aacf-2ea6168a1205", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7887855f8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7887855f8c-x6nck", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib7b8f1214be", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 06:49:59.171 [INFO][4973] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 06:49:59.171 [INFO][4973] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib7b8f1214be ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 
06:49:59.175 [INFO][4973] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 06:49:59.175 [INFO][4973] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0", GenerateName:"calico-apiserver-7887855f8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ac853da-b498-4eb2-aacf-2ea6168a1205", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7887855f8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6", Pod:"calico-apiserver-7887855f8c-x6nck", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib7b8f1214be", MAC:"9a:27:39:99:d6:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:49:59.191596 containerd[1631]: 2025-11-24 06:49:59.186 [INFO][4973] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" Namespace="calico-apiserver" Pod="calico-apiserver-7887855f8c-x6nck" WorkloadEndpoint="localhost-k8s-calico--apiserver--7887855f8c--x6nck-eth0" Nov 24 06:49:59.207329 containerd[1631]: time="2025-11-24T06:49:59.207297922Z" level=info msg="connecting to shim 0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6" address="unix:///run/containerd/s/21d77a5752774a9ee2c530862077b30e37275224a23d53944e2c461c92941d58" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:49:59.225459 systemd[1]: Started cri-containerd-0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6.scope - libcontainer container 0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6. 
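The IPAM entries above show the CNI plugin assigning 192.168.88.136/32 (coredns) and 192.168.88.137/32 (calico-apiserver) out of the host-affine block 192.168.88.128/26, i.e. the range 192.168.88.128 through 192.168.88.191. A short standard-library Go sketch (illustrative only, not part of the log) that checks the containment those lines assert:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block and addresses copied from the Calico IPAM log lines above.
	block := netip.MustParsePrefix("192.168.88.128/26")
	for _, s := range []string{"192.168.88.136", "192.168.88.137"} {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s inside %s: %v\n", addr, block, block.Contains(addr))
	}
}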
Nov 24 06:49:59.234526 systemd-resolved[1510]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:49:59.261269 containerd[1631]: time="2025-11-24T06:49:59.261243596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7887855f8c-x6nck,Uid:3ac853da-b498-4eb2-aacf-2ea6168a1205,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0c71cb1df5ffb54912bb1bfa393a70e598e8e1ecd7376f53ca0b9349bba84dd6\"" Nov 24 06:49:59.262420 containerd[1631]: time="2025-11-24T06:49:59.262401430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:49:59.454813 kubelet[2928]: E1124 06:49:59.454518 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:49:59.478589 kubelet[2928]: I1124 06:49:59.478550 2928 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-jmbwk" podStartSLOduration=40.478536825 podStartE2EDuration="40.478536825s" podCreationTimestamp="2025-11-24 06:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:49:59.477890938 +0000 UTC m=+45.549344563" watchObservedRunningTime="2025-11-24 06:49:59.478536825 +0000 UTC m=+45.549990451" Nov 24 06:49:59.618348 containerd[1631]: time="2025-11-24T06:49:59.618313814Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:49:59.630353 containerd[1631]: time="2025-11-24T06:49:59.630273209Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:49:59.630353 containerd[1631]: time="2025-11-24T06:49:59.630334401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:49:59.630544 kubelet[2928]: E1124 06:49:59.630518 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:59.630664 kubelet[2928]: E1124 06:49:59.630602 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:49:59.630761 kubelet[2928]: E1124 06:49:59.630733 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wmxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:49:59.632040 kubelet[2928]: E1124 06:49:59.632016 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:49:59.692357 systemd-networkd[1508]: cali31b21cf93b9: Gained IPv6LL Nov 24 06:50:00.204324 systemd-networkd[1508]: cali5a6d345dc66: Gained IPv6LL Nov 24 06:50:00.455936 kubelet[2928]: E1124 06:50:00.455826 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:50:01.164324 systemd-networkd[1508]: calib7b8f1214be: Gained IPv6LL Nov 24 06:50:07.093281 containerd[1631]: time="2025-11-24T06:50:07.093209986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:50:07.408373 containerd[1631]: time="2025-11-24T06:50:07.408256930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:07.417686 containerd[1631]: time="2025-11-24T06:50:07.417611104Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:50:07.417787 containerd[1631]: time="2025-11-24T06:50:07.417676104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:50:07.417817 kubelet[2928]: E1124 06:50:07.417782 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:50:07.418068 kubelet[2928]: E1124 06:50:07.417821 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:50:07.418068 kubelet[2928]: E1124 06:50:07.417908 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1dfd73395a3b493db78d35bb9e9b6696,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:07.420764 containerd[1631]: time="2025-11-24T06:50:07.420737231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:50:07.763992 containerd[1631]: time="2025-11-24T06:50:07.763954762Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:07.772999 containerd[1631]: time="2025-11-24T06:50:07.772861380Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:50:07.772999 containerd[1631]: time="2025-11-24T06:50:07.772888731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:50:07.773190 kubelet[2928]: E1124 06:50:07.773081 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:50:07.773190 kubelet[2928]: E1124 06:50:07.773124 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:50:07.773280 kubelet[2928]: E1124 06:50:07.773248 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:07.774512 kubelet[2928]: E1124 06:50:07.774473 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:50:10.093359 containerd[1631]: time="2025-11-24T06:50:10.093285948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:50:10.461883 containerd[1631]: time="2025-11-24T06:50:10.461775604Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:10.462270 containerd[1631]: time="2025-11-24T06:50:10.462168556Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code 
= NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:50:10.462270 containerd[1631]: time="2025-11-24T06:50:10.462236693Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:50:10.462418 kubelet[2928]: E1124 06:50:10.462394 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:50:10.463810 kubelet[2928]: E1124 06:50:10.462426 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:50:10.463810 kubelet[2928]: E1124 06:50:10.462632 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqgvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cdc8946dd-7dnrd_calico-system(a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:10.463969 containerd[1631]: time="2025-11-24T06:50:10.462601984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:10.464758 kubelet[2928]: E1124 06:50:10.464728 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:50:10.784077 containerd[1631]: time="2025-11-24T06:50:10.784044371Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:10.784502 containerd[1631]: time="2025-11-24T06:50:10.784472997Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:10.784544 containerd[1631]: time="2025-11-24T06:50:10.784484532Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:10.784722 kubelet[2928]: E1124 06:50:10.784640 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:10.784722 kubelet[2928]: E1124 06:50:10.784676 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:10.785194 kubelet[2928]: 
E1124 06:50:10.785032 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlhhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:10.786243 containerd[1631]: time="2025-11-24T06:50:10.785183589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:50:10.786654 kubelet[2928]: E1124 06:50:10.786636 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:50:11.108742 containerd[1631]: time="2025-11-24T06:50:11.108397708Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:11.114944 containerd[1631]: time="2025-11-24T06:50:11.114897610Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:50:11.115041 containerd[1631]: time="2025-11-24T06:50:11.114972815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:11.115145 kubelet[2928]: E1124 06:50:11.115085 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:50:11.115196 kubelet[2928]: E1124 06:50:11.115141 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:50:11.115393 kubelet[2928]: E1124 06:50:11.115342 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd8l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:11.115671 containerd[1631]: time="2025-11-24T06:50:11.115609156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:50:11.116750 kubelet[2928]: E1124 06:50:11.116722 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:50:11.442008 containerd[1631]: time="2025-11-24T06:50:11.441684866Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:11.442246 containerd[1631]: time="2025-11-24T06:50:11.442079803Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:50:11.442246 containerd[1631]: time="2025-11-24T06:50:11.442086421Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:50:11.442321 kubelet[2928]: E1124 06:50:11.442263 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:50:11.442321 kubelet[2928]: E1124 06:50:11.442300 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:50:11.442682 kubelet[2928]: E1124 06:50:11.442543 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:11.442789 containerd[1631]: time="2025-11-24T06:50:11.442716929Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:11.777490 containerd[1631]: time="2025-11-24T06:50:11.777428829Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:11.783360 containerd[1631]: time="2025-11-24T06:50:11.783332268Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:11.783425 containerd[1631]: time="2025-11-24T06:50:11.783390894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:11.783498 kubelet[2928]: E1124 06:50:11.783474 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:11.784143 kubelet[2928]: E1124 06:50:11.783509 2928 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:11.784143 kubelet[2928]: E1124 06:50:11.783697 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wmxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:11.784296 containerd[1631]: time="2025-11-24T06:50:11.783721419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:11.785206 kubelet[2928]: E1124 06:50:11.785183 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" 
podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:50:12.120460 containerd[1631]: time="2025-11-24T06:50:12.120374897Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:12.126869 containerd[1631]: time="2025-11-24T06:50:12.126839820Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:12.126970 containerd[1631]: time="2025-11-24T06:50:12.126884821Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:12.127112 kubelet[2928]: E1124 06:50:12.126951 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:12.127112 kubelet[2928]: E1124 06:50:12.126979 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:12.127187 kubelet[2928]: E1124 06:50:12.127143 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jrzb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:12.127621 containerd[1631]: time="2025-11-24T06:50:12.127429422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:50:12.128740 kubelet[2928]: E1124 06:50:12.128712 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:50:12.445186 containerd[1631]: time="2025-11-24T06:50:12.445105976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:12.450974 containerd[1631]: time="2025-11-24T06:50:12.450943256Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:50:12.451142 containerd[1631]: time="2025-11-24T06:50:12.451003294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:50:12.451192 kubelet[2928]: E1124 06:50:12.451090 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:50:12.451329 kubelet[2928]: E1124 06:50:12.451124 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:50:12.451427 kubelet[2928]: E1124 06:50:12.451399 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:12.452664 kubelet[2928]: E1124 06:50:12.452626 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:50:18.093122 kubelet[2928]: E1124 06:50:18.092822 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:50:22.093030 kubelet[2928]: E1124 06:50:22.092579 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:50:23.092480 kubelet[2928]: E1124 06:50:23.092435 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:50:23.092613 kubelet[2928]: E1124 06:50:23.092511 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:50:23.092613 kubelet[2928]: E1124 06:50:23.092558 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:50:24.094639 kubelet[2928]: E1124 06:50:24.094392 2928 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:50:25.091984 kubelet[2928]: E1124 06:50:25.091936 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:50:31.093071 containerd[1631]: time="2025-11-24T06:50:31.093043797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:50:31.424044 containerd[1631]: time="2025-11-24T06:50:31.423804491Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:31.424474 containerd[1631]: time="2025-11-24T06:50:31.424415919Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:50:31.424474 containerd[1631]: time="2025-11-24T06:50:31.424440819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:50:31.425180 kubelet[2928]: E1124 06:50:31.424616 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:50:31.425180 kubelet[2928]: E1124 06:50:31.424648 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:50:31.425180 kubelet[2928]: E1124 06:50:31.424739 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1dfd73395a3b493db78d35bb9e9b6696,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:31.426545 containerd[1631]: time="2025-11-24T06:50:31.426489871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:50:31.766488 containerd[1631]: time="2025-11-24T06:50:31.766459323Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:31.766820 containerd[1631]: time="2025-11-24T06:50:31.766798059Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:50:31.766880 containerd[1631]: time="2025-11-24T06:50:31.766866483Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:50:31.766999 kubelet[2928]: E1124 06:50:31.766956 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:50:31.767034 kubelet[2928]: E1124 06:50:31.767007 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:50:31.767114 kubelet[2928]: E1124 06:50:31.767089 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:31.769231 kubelet[2928]: E1124 06:50:31.768652 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:50:34.095393 containerd[1631]: time="2025-11-24T06:50:34.095285213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:34.338016 systemd[1]: Started sshd@7-139.178.70.102:22-147.75.109.163:37686.service - OpenSSH per-connection server daemon 
(147.75.109.163:37686). Nov 24 06:50:34.463215 containerd[1631]: time="2025-11-24T06:50:34.463181716Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:34.463773 containerd[1631]: time="2025-11-24T06:50:34.463675227Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:34.463773 containerd[1631]: time="2025-11-24T06:50:34.463752903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:34.463951 kubelet[2928]: E1124 06:50:34.463925 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:34.464471 kubelet[2928]: E1124 06:50:34.464215 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:34.465127 containerd[1631]: time="2025-11-24T06:50:34.464711163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:50:34.465208 kubelet[2928]: E1124 06:50:34.465172 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlhhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:34.466439 kubelet[2928]: E1124 06:50:34.466406 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:50:34.528789 sshd[5120]: Accepted publickey for core from 147.75.109.163 port 37686 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:34.561320 sshd-session[5120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:34.582258 systemd-logind[1602]: New session 10 of user core. Nov 24 06:50:34.591337 systemd[1]: Started session-10.scope - Session 10 of User core. 
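
The kubelet entries above alternate between two waiting reasons: ErrImagePull right after a pull attempt fails, and ImagePullBackOff while the retry window is still open. A quick way to see which pods are currently stuck in either state is to read the container statuses from the API server; the sketch below does that with client-go, assuming a kubeconfig at the default path and using the two namespaces that appear in this log. It is an illustration, not part of the captured session.

```go
// Sketch: list containers that are waiting on image-pull errors.
// Assumptions: client-go is available as a module, a kubeconfig exists at
// ~/.kube/config, and the namespaces below are the ones seen in the log.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	for _, ns := range []string{"calico-system", "calico-apiserver"} {
		pods, err := client.CoreV1().Pods(ns).List(context.Background(), metav1.ListOptions{})
		if err != nil {
			panic(err)
		}
		for _, pod := range pods.Items {
			for _, cs := range pod.Status.ContainerStatuses {
				// ErrImagePull and ImagePullBackOff both appear as waiting reasons.
				if cs.State.Waiting != nil {
					fmt.Printf("%s/%s %s: %s (%s)\n",
						ns, pod.Name, cs.Name, cs.State.Waiting.Reason, cs.State.Waiting.Message)
				}
			}
		}
	}
}
```
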
Nov 24 06:50:34.790262 containerd[1631]: time="2025-11-24T06:50:34.790154530Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:34.795293 containerd[1631]: time="2025-11-24T06:50:34.791883336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:50:34.795293 containerd[1631]: time="2025-11-24T06:50:34.791935866Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:34.798481 kubelet[2928]: E1124 06:50:34.792036 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:50:34.798481 kubelet[2928]: E1124 06:50:34.792084 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:50:34.798481 kubelet[2928]: E1124 06:50:34.792175 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd8l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:34.798481 kubelet[2928]: E1124 06:50:34.793316 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:50:35.486298 sshd[5123]: Connection closed by 147.75.109.163 port 37686 Nov 24 06:50:35.486195 sshd-session[5120]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:35.492501 systemd[1]: sshd@7-139.178.70.102:22-147.75.109.163:37686.service: Deactivated successfully. Nov 24 06:50:35.494373 systemd[1]: session-10.scope: Deactivated successfully. Nov 24 06:50:35.495410 systemd-logind[1602]: Session 10 logged out. Waiting for processes to exit. Nov 24 06:50:35.496738 systemd-logind[1602]: Removed session 10. 
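
Every failed pull in this log follows the same path: containerd asks ghcr.io to resolve the tag, the registry answers 404 Not Found, and containerd returns a NotFound "failed to resolve reference" error that the kubelet surfaces as ErrImagePull. The resolution step can be reproduced in isolation with containerd's resolver library; the sketch below assumes that library is available as a Go module and uses one of the references from the log.

```go
// Sketch: reproduce the reference-resolution step containerd performs before
// a pull. A registry 404 for the tag surfaces here as a "not found" error,
// matching the "failed to resolve reference ... not found" messages above.
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd/remotes/docker"
)

func main() {
	// Default resolver options: anonymous access, registry taken from the ref.
	resolver := docker.NewResolver(docker.ResolverOptions{})
	ref := "ghcr.io/flatcar/calico/goldmane:v3.30.4" // reference taken from the log

	name, desc, err := resolver.Resolve(context.Background(), ref)
	if err != nil {
		fmt.Printf("resolve %s failed: %v\n", ref, err)
		return
	}
	fmt.Printf("resolved %s -> %s (media type %s, digest %s)\n",
		ref, name, desc.MediaType, desc.Digest)
}
```
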
Nov 24 06:50:37.093018 containerd[1631]: time="2025-11-24T06:50:37.092693832Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:37.430354 containerd[1631]: time="2025-11-24T06:50:37.430200790Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:37.431227 containerd[1631]: time="2025-11-24T06:50:37.431200666Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:37.431274 containerd[1631]: time="2025-11-24T06:50:37.431261170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:37.431426 kubelet[2928]: E1124 06:50:37.431393 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:37.431646 kubelet[2928]: E1124 06:50:37.431432 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:37.431766 kubelet[2928]: E1124 06:50:37.431719 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wmxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:37.431840 containerd[1631]: time="2025-11-24T06:50:37.431799161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:50:37.433145 kubelet[2928]: E1124 06:50:37.433123 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:50:37.760323 containerd[1631]: time="2025-11-24T06:50:37.760288915Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:37.762551 containerd[1631]: time="2025-11-24T06:50:37.762522971Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:50:37.762615 containerd[1631]: time="2025-11-24T06:50:37.762576387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:50:37.764363 kubelet[2928]: E1124 06:50:37.764332 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:50:37.764425 kubelet[2928]: E1124 06:50:37.764366 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:50:37.764580 kubelet[2928]: E1124 06:50:37.764525 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqgvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cdc8946dd-7dnrd_calico-system(a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:37.764990 containerd[1631]: time="2025-11-24T06:50:37.764810349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:50:37.766084 kubelet[2928]: E1124 06:50:37.766069 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:50:38.092270 containerd[1631]: time="2025-11-24T06:50:38.092177039Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:38.093648 containerd[1631]: time="2025-11-24T06:50:38.093598401Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:50:38.094234 containerd[1631]: time="2025-11-24T06:50:38.093865666Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:50:38.094274 kubelet[2928]: E1124 06:50:38.093947 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:38.094274 kubelet[2928]: E1124 06:50:38.093970 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:50:38.094274 kubelet[2928]: E1124 06:50:38.094079 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jrzb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:38.094799 containerd[1631]: time="2025-11-24T06:50:38.094527265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:50:38.095309 kubelet[2928]: E1124 06:50:38.095294 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:50:38.405877 containerd[1631]: time="2025-11-24T06:50:38.405791305Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:38.408496 containerd[1631]: time="2025-11-24T06:50:38.408458683Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:50:38.408972 containerd[1631]: time="2025-11-24T06:50:38.408521284Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:50:38.409007 kubelet[2928]: E1124 06:50:38.408602 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:50:38.409007 kubelet[2928]: E1124 06:50:38.408634 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:50:38.409007 kubelet[2928]: E1124 06:50:38.408708 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:38.411598 containerd[1631]: time="2025-11-24T06:50:38.411573344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:50:38.745153 containerd[1631]: time="2025-11-24T06:50:38.745120083Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:50:38.750724 containerd[1631]: time="2025-11-24T06:50:38.750699007Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:50:38.751334 containerd[1631]: time="2025-11-24T06:50:38.750749526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:50:38.754210 kubelet[2928]: E1124 06:50:38.753912 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:50:38.754210 kubelet[2928]: E1124 06:50:38.753945 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:50:38.754210 kubelet[2928]: E1124 06:50:38.754013 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:50:38.755628 kubelet[2928]: E1124 06:50:38.755577 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:50:40.502678 systemd[1]: Started sshd@8-139.178.70.102:22-147.75.109.163:37698.service - OpenSSH per-connection server daemon (147.75.109.163:37698). Nov 24 06:50:40.582756 sshd[5138]: Accepted publickey for core from 147.75.109.163 port 37698 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:40.584734 sshd-session[5138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:40.588419 systemd-logind[1602]: New session 11 of user core. Nov 24 06:50:40.594440 systemd[1]: Started session-11.scope - Session 11 of User core. Nov 24 06:50:40.819122 sshd[5141]: Connection closed by 147.75.109.163 port 37698 Nov 24 06:50:40.819568 sshd-session[5138]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:40.822996 systemd[1]: sshd@8-139.178.70.102:22-147.75.109.163:37698.service: Deactivated successfully. Nov 24 06:50:40.824721 systemd[1]: session-11.scope: Deactivated successfully. Nov 24 06:50:40.825511 systemd-logind[1602]: Session 11 logged out. Waiting for processes to exit. Nov 24 06:50:40.826767 systemd-logind[1602]: Removed session 11. Nov 24 06:50:45.092779 kubelet[2928]: E1124 06:50:45.092704 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:50:45.829723 systemd[1]: Started sshd@9-139.178.70.102:22-147.75.109.163:52728.service - OpenSSH per-connection server daemon (147.75.109.163:52728). Nov 24 06:50:45.930146 sshd[5153]: Accepted publickey for core from 147.75.109.163 port 52728 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:45.930966 sshd-session[5153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:45.933989 systemd-logind[1602]: New session 12 of user core. Nov 24 06:50:45.939376 systemd[1]: Started session-12.scope - Session 12 of User core. Nov 24 06:50:46.097692 sshd[5156]: Connection closed by 147.75.109.163 port 52728 Nov 24 06:50:46.098147 sshd-session[5153]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:46.109921 systemd[1]: sshd@9-139.178.70.102:22-147.75.109.163:52728.service: Deactivated successfully. Nov 24 06:50:46.112494 systemd[1]: session-12.scope: Deactivated successfully. Nov 24 06:50:46.113848 systemd-logind[1602]: Session 12 logged out. Waiting for processes to exit. 
Nov 24 06:50:46.117644 systemd[1]: Started sshd@10-139.178.70.102:22-147.75.109.163:52732.service - OpenSSH per-connection server daemon (147.75.109.163:52732). Nov 24 06:50:46.121576 systemd-logind[1602]: Removed session 12. Nov 24 06:50:46.168271 sshd[5168]: Accepted publickey for core from 147.75.109.163 port 52732 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:46.169020 sshd-session[5168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:46.172165 systemd-logind[1602]: New session 13 of user core. Nov 24 06:50:46.176309 systemd[1]: Started session-13.scope - Session 13 of User core. Nov 24 06:50:46.300952 sshd[5171]: Connection closed by 147.75.109.163 port 52732 Nov 24 06:50:46.301858 sshd-session[5168]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:46.310288 systemd[1]: sshd@10-139.178.70.102:22-147.75.109.163:52732.service: Deactivated successfully. Nov 24 06:50:46.311756 systemd[1]: session-13.scope: Deactivated successfully. Nov 24 06:50:46.313542 systemd-logind[1602]: Session 13 logged out. Waiting for processes to exit. Nov 24 06:50:46.317381 systemd[1]: Started sshd@11-139.178.70.102:22-147.75.109.163:52740.service - OpenSSH per-connection server daemon (147.75.109.163:52740). Nov 24 06:50:46.320076 systemd-logind[1602]: Removed session 13. Nov 24 06:50:46.377751 sshd[5181]: Accepted publickey for core from 147.75.109.163 port 52740 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:46.378474 sshd-session[5181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:46.381991 systemd-logind[1602]: New session 14 of user core. Nov 24 06:50:46.386648 systemd[1]: Started session-14.scope - Session 14 of User core. Nov 24 06:50:46.480345 sshd[5184]: Connection closed by 147.75.109.163 port 52740 Nov 24 06:50:46.480760 sshd-session[5181]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:46.483322 systemd[1]: sshd@11-139.178.70.102:22-147.75.109.163:52740.service: Deactivated successfully. Nov 24 06:50:46.484696 systemd[1]: session-14.scope: Deactivated successfully. Nov 24 06:50:46.485214 systemd-logind[1602]: Session 14 logged out. Waiting for processes to exit. Nov 24 06:50:46.486127 systemd-logind[1602]: Removed session 14. 
Nov 24 06:50:48.093901 kubelet[2928]: E1124 06:50:48.093871 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:50:49.092246 kubelet[2928]: E1124 06:50:49.092138 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:50:50.092133 kubelet[2928]: E1124 06:50:50.092088 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:50:51.490474 systemd[1]: Started sshd@12-139.178.70.102:22-147.75.109.163:34236.service - OpenSSH per-connection server daemon (147.75.109.163:34236). Nov 24 06:50:51.534877 sshd[5225]: Accepted publickey for core from 147.75.109.163 port 34236 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:51.535993 sshd-session[5225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:51.538610 systemd-logind[1602]: New session 15 of user core. Nov 24 06:50:51.548322 systemd[1]: Started session-15.scope - Session 15 of User core. Nov 24 06:50:51.977282 sshd[5228]: Connection closed by 147.75.109.163 port 34236 Nov 24 06:50:51.989672 sshd-session[5225]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:52.004204 systemd[1]: sshd@12-139.178.70.102:22-147.75.109.163:34236.service: Deactivated successfully. Nov 24 06:50:52.006743 systemd[1]: session-15.scope: Deactivated successfully. Nov 24 06:50:52.011552 systemd-logind[1602]: Session 15 logged out. Waiting for processes to exit. Nov 24 06:50:52.012335 systemd-logind[1602]: Removed session 15. 
Nov 24 06:50:52.092984 kubelet[2928]: E1124 06:50:52.092960 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:50:52.094052 kubelet[2928]: E1124 06:50:52.093271 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:50:53.093578 kubelet[2928]: E1124 06:50:53.093504 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:50:56.992762 systemd[1]: Started sshd@13-139.178.70.102:22-147.75.109.163:34252.service - OpenSSH per-connection server daemon (147.75.109.163:34252). 
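[Editor's note, not part of the captured log] The "Back-off pulling image ... ImagePullBackOff" entries above reflect kubelet's per-image exponential backoff between pull retries, which is why the same NotFound error keeps reappearing at widening intervals. The sketch below only illustrates that schedule; the 10-second initial delay and 300-second cap are assumed kubelet defaults, not values read from this log.

```python
# Illustrative backoff schedule of the kind kubelet applies while a container
# sits in ImagePullBackOff: the delay doubles per failed pull, up to a cap.
def backoff_schedule(initial: float = 10.0, cap: float = 300.0, attempts: int = 8):
    delay = initial
    schedule = []
    for _ in range(attempts):
        schedule.append(min(delay, cap))
        delay *= 2  # doubles after each failed attempt until the cap is reached
    return schedule

print(backoff_schedule())
# [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]
```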
Nov 24 06:50:57.093109 kubelet[2928]: E1124 06:50:57.093059 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:50:57.168247 sshd[5245]: Accepted publickey for core from 147.75.109.163 port 34252 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:50:57.169409 sshd-session[5245]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:50:57.173426 systemd-logind[1602]: New session 16 of user core. Nov 24 06:50:57.180340 systemd[1]: Started session-16.scope - Session 16 of User core. Nov 24 06:50:57.323270 sshd[5248]: Connection closed by 147.75.109.163 port 34252 Nov 24 06:50:57.322119 sshd-session[5245]: pam_unix(sshd:session): session closed for user core Nov 24 06:50:57.326897 systemd[1]: sshd@13-139.178.70.102:22-147.75.109.163:34252.service: Deactivated successfully. Nov 24 06:50:57.328065 systemd[1]: session-16.scope: Deactivated successfully. Nov 24 06:50:57.330453 systemd-logind[1602]: Session 16 logged out. Waiting for processes to exit. Nov 24 06:50:57.331042 systemd-logind[1602]: Removed session 16. 
Nov 24 06:51:01.091908 kubelet[2928]: E1124 06:51:01.091799 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:51:02.093824 kubelet[2928]: E1124 06:51:02.093795 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:51:02.093824 kubelet[2928]: E1124 06:51:02.093818 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:51:02.333372 systemd[1]: Started sshd@14-139.178.70.102:22-147.75.109.163:40850.service - OpenSSH per-connection server daemon (147.75.109.163:40850). Nov 24 06:51:02.396297 sshd[5261]: Accepted publickey for core from 147.75.109.163 port 40850 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:02.397652 sshd-session[5261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:02.401681 systemd-logind[1602]: New session 17 of user core. Nov 24 06:51:02.408286 systemd[1]: Started session-17.scope - Session 17 of User core. Nov 24 06:51:02.571769 sshd[5264]: Connection closed by 147.75.109.163 port 40850 Nov 24 06:51:02.572181 sshd-session[5261]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:02.581233 systemd[1]: sshd@14-139.178.70.102:22-147.75.109.163:40850.service: Deactivated successfully. Nov 24 06:51:02.583564 systemd[1]: session-17.scope: Deactivated successfully. Nov 24 06:51:02.585330 systemd-logind[1602]: Session 17 logged out. Waiting for processes to exit. Nov 24 06:51:02.587935 systemd-logind[1602]: Removed session 17. Nov 24 06:51:02.589787 systemd[1]: Started sshd@15-139.178.70.102:22-147.75.109.163:40856.service - OpenSSH per-connection server daemon (147.75.109.163:40856). Nov 24 06:51:02.652155 sshd[5275]: Accepted publickey for core from 147.75.109.163 port 40856 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:02.653042 sshd-session[5275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:02.655926 systemd-logind[1602]: New session 18 of user core. 
Nov 24 06:51:02.661467 systemd[1]: Started session-18.scope - Session 18 of User core. Nov 24 06:51:03.092248 kubelet[2928]: E1124 06:51:03.092095 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:51:03.493349 sshd[5278]: Connection closed by 147.75.109.163 port 40856 Nov 24 06:51:03.493872 sshd-session[5275]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:03.501295 systemd[1]: sshd@15-139.178.70.102:22-147.75.109.163:40856.service: Deactivated successfully. Nov 24 06:51:03.502729 systemd[1]: session-18.scope: Deactivated successfully. Nov 24 06:51:03.503513 systemd-logind[1602]: Session 18 logged out. Waiting for processes to exit. Nov 24 06:51:03.505074 systemd-logind[1602]: Removed session 18. Nov 24 06:51:03.507718 systemd[1]: Started sshd@16-139.178.70.102:22-147.75.109.163:40872.service - OpenSSH per-connection server daemon (147.75.109.163:40872). Nov 24 06:51:03.597729 sshd[5288]: Accepted publickey for core from 147.75.109.163 port 40872 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:03.598669 sshd-session[5288]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:03.601994 systemd-logind[1602]: New session 19 of user core. Nov 24 06:51:03.609380 systemd[1]: Started session-19.scope - Session 19 of User core. Nov 24 06:51:04.406027 sshd[5291]: Connection closed by 147.75.109.163 port 40872 Nov 24 06:51:04.410279 sshd-session[5288]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:04.415514 systemd[1]: sshd@16-139.178.70.102:22-147.75.109.163:40872.service: Deactivated successfully. Nov 24 06:51:04.416787 systemd[1]: session-19.scope: Deactivated successfully. Nov 24 06:51:04.418955 systemd-logind[1602]: Session 19 logged out. Waiting for processes to exit. Nov 24 06:51:04.422980 systemd[1]: Started sshd@17-139.178.70.102:22-147.75.109.163:40886.service - OpenSSH per-connection server daemon (147.75.109.163:40886). Nov 24 06:51:04.423430 systemd-logind[1602]: Removed session 19. Nov 24 06:51:04.500241 sshd[5307]: Accepted publickey for core from 147.75.109.163 port 40886 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:04.502369 sshd-session[5307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:04.507057 systemd-logind[1602]: New session 20 of user core. Nov 24 06:51:04.511378 systemd[1]: Started session-20.scope - Session 20 of User core. Nov 24 06:51:04.822533 sshd[5310]: Connection closed by 147.75.109.163 port 40886 Nov 24 06:51:04.821725 sshd-session[5307]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:04.828472 systemd[1]: sshd@17-139.178.70.102:22-147.75.109.163:40886.service: Deactivated successfully. Nov 24 06:51:04.830251 systemd[1]: session-20.scope: Deactivated successfully. Nov 24 06:51:04.830963 systemd-logind[1602]: Session 20 logged out. Waiting for processes to exit. 
Nov 24 06:51:04.833467 systemd[1]: Started sshd@18-139.178.70.102:22-147.75.109.163:40898.service - OpenSSH per-connection server daemon (147.75.109.163:40898). Nov 24 06:51:04.834834 systemd-logind[1602]: Removed session 20. Nov 24 06:51:04.887689 sshd[5322]: Accepted publickey for core from 147.75.109.163 port 40898 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:04.889567 sshd-session[5322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:04.895166 systemd-logind[1602]: New session 21 of user core. Nov 24 06:51:04.904371 systemd[1]: Started session-21.scope - Session 21 of User core. Nov 24 06:51:05.019062 sshd[5325]: Connection closed by 147.75.109.163 port 40898 Nov 24 06:51:05.019278 sshd-session[5322]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:05.021954 systemd-logind[1602]: Session 21 logged out. Waiting for processes to exit. Nov 24 06:51:05.022120 systemd[1]: sshd@18-139.178.70.102:22-147.75.109.163:40898.service: Deactivated successfully. Nov 24 06:51:05.023479 systemd[1]: session-21.scope: Deactivated successfully. Nov 24 06:51:05.024824 systemd-logind[1602]: Removed session 21. Nov 24 06:51:05.091749 kubelet[2928]: E1124 06:51:05.091669 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:51:08.121321 kubelet[2928]: E1124 06:51:08.121181 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:51:08.122412 kubelet[2928]: E1124 06:51:08.121598 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:51:10.030500 systemd[1]: Started sshd@19-139.178.70.102:22-147.75.109.163:40910.service - OpenSSH per-connection server daemon (147.75.109.163:40910). Nov 24 06:51:10.085672 sshd[5338]: Accepted publickey for core from 147.75.109.163 port 40910 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:10.087670 sshd-session[5338]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:10.095662 systemd-logind[1602]: New session 22 of user core. Nov 24 06:51:10.101410 systemd[1]: Started session-22.scope - Session 22 of User core. Nov 24 06:51:10.269650 sshd[5342]: Connection closed by 147.75.109.163 port 40910 Nov 24 06:51:10.270180 sshd-session[5338]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:10.274431 systemd[1]: sshd@19-139.178.70.102:22-147.75.109.163:40910.service: Deactivated successfully. Nov 24 06:51:10.276298 systemd[1]: session-22.scope: Deactivated successfully. Nov 24 06:51:10.277029 systemd-logind[1602]: Session 22 logged out. Waiting for processes to exit. Nov 24 06:51:10.278580 systemd-logind[1602]: Removed session 22. Nov 24 06:51:13.093047 kubelet[2928]: E1124 06:51:13.092982 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:51:14.233580 kubelet[2928]: E1124 06:51:14.232822 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:51:15.091546 kubelet[2928]: E1124 06:51:15.091504 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cdc8946dd-7dnrd" podUID="a2bb8cf5-1cc1-40de-90a6-0c1bddb1bf68" Nov 24 06:51:15.280895 systemd[1]: 
Started sshd@20-139.178.70.102:22-147.75.109.163:52420.service - OpenSSH per-connection server daemon (147.75.109.163:52420). Nov 24 06:51:15.324195 sshd[5363]: Accepted publickey for core from 147.75.109.163 port 52420 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:15.325175 sshd-session[5363]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:15.328721 systemd-logind[1602]: New session 23 of user core. Nov 24 06:51:15.337462 systemd[1]: Started session-23.scope - Session 23 of User core. Nov 24 06:51:15.464011 sshd[5366]: Connection closed by 147.75.109.163 port 52420 Nov 24 06:51:15.464300 sshd-session[5363]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:15.468016 systemd[1]: sshd@20-139.178.70.102:22-147.75.109.163:52420.service: Deactivated successfully. Nov 24 06:51:15.468169 systemd-logind[1602]: Session 23 logged out. Waiting for processes to exit. Nov 24 06:51:15.470318 systemd[1]: session-23.scope: Deactivated successfully. Nov 24 06:51:15.472216 systemd-logind[1602]: Removed session 23. Nov 24 06:51:17.138397 containerd[1631]: time="2025-11-24T06:51:17.138371442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:51:17.493789 containerd[1631]: time="2025-11-24T06:51:17.493618877Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:17.497833 containerd[1631]: time="2025-11-24T06:51:17.497812775Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:51:17.497929 containerd[1631]: time="2025-11-24T06:51:17.497875622Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:51:17.498076 kubelet[2928]: E1124 06:51:17.498043 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:17.514852 kubelet[2928]: E1124 06:51:17.514807 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:17.520526 kubelet[2928]: E1124 06:51:17.520479 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlhhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-7hjcp_calico-apiserver(f9c4aeec-8d56-49ab-910e-5dc9d27b3e29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:17.521652 kubelet[2928]: E1124 06:51:17.521627 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-7hjcp" podUID="f9c4aeec-8d56-49ab-910e-5dc9d27b3e29" Nov 24 06:51:20.092666 containerd[1631]: time="2025-11-24T06:51:20.092600775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:51:20.386678 containerd[1631]: time="2025-11-24T06:51:20.386601537Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:20.389850 containerd[1631]: time="2025-11-24T06:51:20.389788061Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:51:20.389850 containerd[1631]: 
time="2025-11-24T06:51:20.389833293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:51:20.390230 kubelet[2928]: E1124 06:51:20.389996 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:20.390230 kubelet[2928]: E1124 06:51:20.390030 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:20.390230 kubelet[2928]: E1124 06:51:20.390106 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jrzb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-66994bd4cb-j58dd_calico-apiserver(54ddf3ce-7798-43d6-964a-ec131b6bd310): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:20.391252 kubelet[2928]: E1124 06:51:20.391215 2928 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-66994bd4cb-j58dd" podUID="54ddf3ce-7798-43d6-964a-ec131b6bd310" Nov 24 06:51:20.473800 systemd[1]: Started sshd@21-139.178.70.102:22-147.75.109.163:52432.service - OpenSSH per-connection server daemon (147.75.109.163:52432). Nov 24 06:51:20.518477 sshd[5401]: Accepted publickey for core from 147.75.109.163 port 52432 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:20.519653 sshd-session[5401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:20.524571 systemd-logind[1602]: New session 24 of user core. Nov 24 06:51:20.531613 systemd[1]: Started session-24.scope - Session 24 of User core. Nov 24 06:51:20.811801 sshd[5408]: Connection closed by 147.75.109.163 port 52432 Nov 24 06:51:20.817140 sshd-session[5401]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:20.827395 systemd[1]: sshd@21-139.178.70.102:22-147.75.109.163:52432.service: Deactivated successfully. Nov 24 06:51:20.828874 systemd[1]: session-24.scope: Deactivated successfully. Nov 24 06:51:20.830837 systemd-logind[1602]: Session 24 logged out. Waiting for processes to exit. Nov 24 06:51:20.833055 systemd-logind[1602]: Removed session 24. Nov 24 06:51:22.094330 containerd[1631]: time="2025-11-24T06:51:22.094069698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:51:22.446853 containerd[1631]: time="2025-11-24T06:51:22.446784241Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:22.447460 containerd[1631]: time="2025-11-24T06:51:22.447438609Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:51:22.447605 containerd[1631]: time="2025-11-24T06:51:22.447498807Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:51:22.447719 kubelet[2928]: E1124 06:51:22.447691 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:51:22.448069 kubelet[2928]: E1124 06:51:22.447724 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:51:22.448069 kubelet[2928]: E1124 06:51:22.447883 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:22.448543 containerd[1631]: time="2025-11-24T06:51:22.448192827Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:51:22.813513 containerd[1631]: time="2025-11-24T06:51:22.813368497Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:22.813752 containerd[1631]: time="2025-11-24T06:51:22.813734213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:51:22.813833 containerd[1631]: time="2025-11-24T06:51:22.813764988Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:51:22.813967 kubelet[2928]: E1124 06:51:22.813941 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:51:22.814010 kubelet[2928]: E1124 06:51:22.813975 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:51:22.814158 kubelet[2928]: E1124 06:51:22.814132 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1dfd73395a3b493db78d35bb9e9b6696,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:22.814589 containerd[1631]: time="2025-11-24T06:51:22.814568613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:51:23.143799 containerd[1631]: time="2025-11-24T06:51:23.143713308Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:23.144601 containerd[1631]: time="2025-11-24T06:51:23.144575182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:51:23.144642 containerd[1631]: time="2025-11-24T06:51:23.144588485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:51:23.144779 kubelet[2928]: E1124 06:51:23.144742 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:51:23.144820 kubelet[2928]: E1124 06:51:23.144787 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:51:23.145468 kubelet[2928]: E1124 06:51:23.144943 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s7x2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-n9f2t_calico-system(0c6abf64-6464-41f7-b11b-979ba6b72128): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:23.145615 containerd[1631]: time="2025-11-24T06:51:23.145153241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:51:23.146287 kubelet[2928]: E1124 06:51:23.146267 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-n9f2t" podUID="0c6abf64-6464-41f7-b11b-979ba6b72128" Nov 24 06:51:23.457447 containerd[1631]: time="2025-11-24T06:51:23.456930831Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:23.462833 containerd[1631]: time="2025-11-24T06:51:23.462799032Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:51:23.462974 containerd[1631]: time="2025-11-24T06:51:23.462867415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:51:23.462999 kubelet[2928]: E1124 06:51:23.462964 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:51:23.463316 kubelet[2928]: E1124 06:51:23.463004 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:51:23.463316 kubelet[2928]: E1124 06:51:23.463074 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-bc589b8c4-9bnlt_calico-system(0c43478b-1cfb-4a98-8686-d4d93291e6b2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:23.464917 kubelet[2928]: E1124 06:51:23.464887 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-bc589b8c4-9bnlt" podUID="0c43478b-1cfb-4a98-8686-d4d93291e6b2" Nov 24 06:51:25.092493 containerd[1631]: time="2025-11-24T06:51:25.092464219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:51:25.423840 containerd[1631]: time="2025-11-24T06:51:25.423604144Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:25.423999 containerd[1631]: time="2025-11-24T06:51:25.423980364Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:51:25.424054 containerd[1631]: time="2025-11-24T06:51:25.424038578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:51:25.424178 kubelet[2928]: E1124 06:51:25.424156 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:25.424515 kubelet[2928]: E1124 06:51:25.424398 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:51:25.424515 kubelet[2928]: E1124 06:51:25.424486 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wmxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7887855f8c-x6nck_calico-apiserver(3ac853da-b498-4eb2-aacf-2ea6168a1205): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:25.426307 kubelet[2928]: E1124 06:51:25.426264 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7887855f8c-x6nck" podUID="3ac853da-b498-4eb2-aacf-2ea6168a1205" Nov 24 06:51:25.821151 systemd[1]: Started sshd@22-139.178.70.102:22-147.75.109.163:55774.service - OpenSSH per-connection server daemon (147.75.109.163:55774). Nov 24 06:51:25.894403 sshd[5446]: Accepted publickey for core from 147.75.109.163 port 55774 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:51:25.895162 sshd-session[5446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:51:25.897936 systemd-logind[1602]: New session 25 of user core. Nov 24 06:51:25.905495 systemd[1]: Started session-25.scope - Session 25 of User core. Nov 24 06:51:26.017135 sshd[5449]: Connection closed by 147.75.109.163 port 55774 Nov 24 06:51:26.017519 sshd-session[5446]: pam_unix(sshd:session): session closed for user core Nov 24 06:51:26.019624 systemd-logind[1602]: Session 25 logged out. Waiting for processes to exit. Nov 24 06:51:26.020346 systemd[1]: sshd@22-139.178.70.102:22-147.75.109.163:55774.service: Deactivated successfully. Nov 24 06:51:26.022027 systemd[1]: session-25.scope: Deactivated successfully. Nov 24 06:51:26.022881 systemd-logind[1602]: Removed session 25. 
Nov 24 06:51:26.094789 containerd[1631]: time="2025-11-24T06:51:26.094588449Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:51:26.421187 containerd[1631]: time="2025-11-24T06:51:26.420962964Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:51:26.421510 containerd[1631]: time="2025-11-24T06:51:26.421452566Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:51:26.421510 containerd[1631]: time="2025-11-24T06:51:26.421488158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:51:26.421670 kubelet[2928]: E1124 06:51:26.421642 2928 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:51:26.421742 kubelet[2928]: E1124 06:51:26.421729 2928 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:51:26.424210 kubelet[2928]: E1124 06:51:26.423949 2928 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd8l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-mxk2w_calico-system(57b14474-5edf-4409-a6bd-e5a9f7dc6f4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:51:26.425632 kubelet[2928]: E1124 06:51:26.425586 2928 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-mxk2w" podUID="57b14474-5edf-4409-a6bd-e5a9f7dc6f4e" Nov 24 06:51:28.092714 containerd[1631]: time="2025-11-24T06:51:28.092409859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:51:28.428530 containerd[1631]: time="2025-11-24T06:51:28.428373086Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
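Every pull in this stretch of the log fails the same way: ghcr.io answers 404 Not Found when containerd tries to resolve the :v3.30.4 tag, so the kubelet keeps retrying with ErrImagePull. A quick way to confirm from any machine whether a given tag resolves at all is to issue the same kind of manifest request the runtime makes. The sketch below is a diagnostic aid, not part of the system above; it assumes the standard Docker Registry v2 anonymous token flow for public ghcr.io images, and the repository and tag are taken from the last pull attempt in the log.

```go
package main

// Diagnostic sketch (assumption: ghcr.io serves public images via the standard
// Docker Registry v2 anonymous token flow). A 404 from the manifest request
// corresponds to the "failed to resolve reference ... not found" errors above.

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
	"os"
)

func main() {
	repo, tag := "flatcar/calico/kube-controllers", "v3.30.4" // values taken from the log

	// 1. Fetch an anonymous pull token for the repository.
	tokenURL := "https://ghcr.io/token?scope=" + url.QueryEscape("repository:"+repo+":pull")
	resp, err := http.Get(tokenURL)
	if err != nil {
		fmt.Fprintln(os.Stderr, "token request failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		fmt.Fprintln(os.Stderr, "decoding token response:", err)
		os.Exit(1)
	}

	// 2. HEAD the manifest for the tag: 200 means the tag resolves, 404 means it does not.
	req, err := http.NewRequest(http.MethodHead, "https://ghcr.io/v2/"+repo+"/manifests/"+tag, nil)
	if err != nil {
		fmt.Fprintln(os.Stderr, "building manifest request:", err)
		os.Exit(1)
	}
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.list.v2+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.v2+json")
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Fprintln(os.Stderr, "manifest request failed:", err)
		os.Exit(1)
	}
	defer res.Body.Close()
	fmt.Printf("%s:%s -> HTTP %d\n", repo, tag, res.StatusCode)
}
```

On the node itself, running crictl pull ghcr.io/flatcar/calico/kube-controllers:v3.30.4 (or ctr -n k8s.io images pull with the same reference) goes through the identical containerd pull path and should reproduce the NotFound error seen in these entries.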